Mar 20 17:17:35 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 17:17:35 crc restorecon[4701]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 17:17:35 crc restorecon[4701]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc 
restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc 
restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc 
restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc 
restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 
crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:17:35 crc 
restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:35 crc 
restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:35 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc 
restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 17:17:36 crc restorecon[4701]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 
crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc 
restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc 
restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc 
restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc 
restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:17:36 crc restorecon[4701]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:17:36 crc restorecon[4701]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 17:17:36 crc kubenswrapper[4795]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 17:17:36 crc kubenswrapper[4795]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 17:17:36 crc kubenswrapper[4795]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 17:17:36 crc kubenswrapper[4795]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 17:17:36 crc kubenswrapper[4795]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 17:17:36 crc kubenswrapper[4795]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.980809 4795 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995035 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995092 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995103 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995112 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995121 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995132 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995143 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995152 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995160 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995170 4795 
feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995178 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995187 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995195 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995205 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995213 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995222 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995234 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995246 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995255 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995264 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995274 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995283 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995292 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995300 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995307 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995318 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995328 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995343 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995351 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995359 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995367 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995378 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995386 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995395 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995404 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995411 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995419 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995427 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995435 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995444 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995452 
4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995460 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995467 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995475 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995485 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995497 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995506 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995515 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995524 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995533 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995541 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995549 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995558 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995566 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995574 4795 feature_gate.go:330] unrecognized 
feature gate: OVNObservability Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995582 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995590 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995598 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995606 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995613 4795 feature_gate.go:330] unrecognized feature gate: Example Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995621 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995630 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995640 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995649 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995658 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995670 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995705 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995717 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995727 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995737 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 17:17:36 crc kubenswrapper[4795]: W0320 17:17:36.995747 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.995962 4795 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.995988 4795 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996007 4795 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996022 4795 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996038 4795 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996050 4795 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996066 4795 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996082 4795 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996095 4795 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996108 4795 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 17:17:36 crc kubenswrapper[4795]: 
I0320 17:17:36.996120 4795 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996135 4795 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996150 4795 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996162 4795 flags.go:64] FLAG: --cgroup-root="" Mar 20 17:17:36 crc kubenswrapper[4795]: I0320 17:17:36.996174 4795 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996185 4795 flags.go:64] FLAG: --client-ca-file="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996197 4795 flags.go:64] FLAG: --cloud-config="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996211 4795 flags.go:64] FLAG: --cloud-provider="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996223 4795 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996239 4795 flags.go:64] FLAG: --cluster-domain="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996251 4795 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996263 4795 flags.go:64] FLAG: --config-dir="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996276 4795 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996290 4795 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996320 4795 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996332 4795 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996345 4795 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 17:17:37 crc 
kubenswrapper[4795]: I0320 17:17:36.996358 4795 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996372 4795 flags.go:64] FLAG: --contention-profiling="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996385 4795 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996398 4795 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996410 4795 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996424 4795 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996464 4795 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996476 4795 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996488 4795 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996500 4795 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996512 4795 flags.go:64] FLAG: --enable-server="true" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996524 4795 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996536 4795 flags.go:64] FLAG: --event-burst="100" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996547 4795 flags.go:64] FLAG: --event-qps="50" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996556 4795 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996565 4795 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996574 4795 flags.go:64] FLAG: --eviction-hard="" Mar 20 17:17:37 crc 
kubenswrapper[4795]: I0320 17:17:36.996589 4795 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996599 4795 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996608 4795 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996619 4795 flags.go:64] FLAG: --eviction-soft="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996630 4795 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996639 4795 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996649 4795 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996658 4795 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996667 4795 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996676 4795 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996715 4795 flags.go:64] FLAG: --feature-gates="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996737 4795 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996747 4795 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996756 4795 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996766 4795 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996777 4795 flags.go:64] FLAG: --healthz-port="10248" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996787 4795 flags.go:64] FLAG: --help="false" Mar 20 17:17:37 crc 
kubenswrapper[4795]: I0320 17:17:36.996796 4795 flags.go:64] FLAG: --hostname-override="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996805 4795 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996815 4795 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996827 4795 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996836 4795 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996845 4795 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996854 4795 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996863 4795 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996873 4795 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996882 4795 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996891 4795 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996901 4795 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996910 4795 flags.go:64] FLAG: --kube-reserved="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996919 4795 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996928 4795 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996938 4795 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996947 4795 flags.go:64] FLAG: --local-storage-capacity-isolation="true" 
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996957 4795 flags.go:64] FLAG: --lock-file="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996966 4795 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996975 4795 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996984 4795 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.996998 4795 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997009 4795 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997018 4795 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997027 4795 flags.go:64] FLAG: --logging-format="text" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997039 4795 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997054 4795 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997079 4795 flags.go:64] FLAG: --manifest-url="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997092 4795 flags.go:64] FLAG: --manifest-url-header="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997109 4795 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997121 4795 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997136 4795 flags.go:64] FLAG: --max-pods="110" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997148 4795 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997160 4795 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 
20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997170 4795 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997179 4795 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997188 4795 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997198 4795 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997207 4795 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997232 4795 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997241 4795 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997251 4795 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997261 4795 flags.go:64] FLAG: --pod-cidr="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997269 4795 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997285 4795 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997293 4795 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997303 4795 flags.go:64] FLAG: --pods-per-core="0" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997312 4795 flags.go:64] FLAG: --port="10250" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997322 4795 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997331 4795 flags.go:64] 
FLAG: --provider-id="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997340 4795 flags.go:64] FLAG: --qos-reserved="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997349 4795 flags.go:64] FLAG: --read-only-port="10255" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997359 4795 flags.go:64] FLAG: --register-node="true" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997368 4795 flags.go:64] FLAG: --register-schedulable="true" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997377 4795 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997396 4795 flags.go:64] FLAG: --registry-burst="10" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997405 4795 flags.go:64] FLAG: --registry-qps="5" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997413 4795 flags.go:64] FLAG: --reserved-cpus="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997424 4795 flags.go:64] FLAG: --reserved-memory="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997437 4795 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997446 4795 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997456 4795 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997465 4795 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997474 4795 flags.go:64] FLAG: --runonce="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997483 4795 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997492 4795 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997503 4795 flags.go:64] FLAG: 
--seccomp-default="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997512 4795 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997522 4795 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997532 4795 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997542 4795 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997551 4795 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997561 4795 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997570 4795 flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997580 4795 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997590 4795 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997600 4795 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997610 4795 flags.go:64] FLAG: --system-cgroups="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997619 4795 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997635 4795 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997644 4795 flags.go:64] FLAG: --tls-cert-file="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997653 4795 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997665 4795 flags.go:64] FLAG: --tls-min-version="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997675 4795 
flags.go:64] FLAG: --tls-private-key-file="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997714 4795 flags.go:64] FLAG: --topology-manager-policy="none" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997724 4795 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997734 4795 flags.go:64] FLAG: --topology-manager-scope="container" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997743 4795 flags.go:64] FLAG: --v="2" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997756 4795 flags.go:64] FLAG: --version="false" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997768 4795 flags.go:64] FLAG: --vmodule="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997779 4795 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.997789 4795 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998059 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998076 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998092 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998102 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998113 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998124 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998135 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 
17:17:36.998145 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998155 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998165 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998175 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998184 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998194 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998209 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998222 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998235 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998247 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998257 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998274 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998284 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998296 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 17:17:37 crc 
kubenswrapper[4795]: W0320 17:17:36.998307 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998317 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998328 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998337 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998345 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998353 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998361 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998372 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998382 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998391 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998400 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998410 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998419 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998429 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998440 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998451 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998461 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998476 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998486 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998497 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998508 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998519 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998528 4795 feature_gate.go:330] 
unrecognized feature gate: ManagedBootImagesAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998538 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998548 4795 feature_gate.go:330] unrecognized feature gate: Example Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998559 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998569 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998579 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998589 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998605 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998616 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998628 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998641 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998653 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998664 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998674 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998742 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998754 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998764 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998774 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998783 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998793 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998802 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998811 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998821 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998830 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998844 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998857 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998869 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:36.998881 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:36.998899 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.013458 4795 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.013790 4795 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013904 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013921 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013927 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013934 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013940 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013946 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013951 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013957 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013962 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013968 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013973 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013979 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013985 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013992 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.013999 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014006 4795 feature_gate.go:330] 
unrecognized feature gate: ManagedBootImages Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014013 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014019 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014026 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014033 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014039 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014044 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014050 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014055 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014060 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014088 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014095 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014101 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014107 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014113 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 
17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014119 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014124 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014130 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014136 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014143 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014148 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014153 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014158 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014164 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014169 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014174 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014181 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014188 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014195 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014201 4795 feature_gate.go:330] unrecognized feature gate: Example Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014207 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014213 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014219 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014225 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014231 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014236 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014243 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014248 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014255 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014260 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014265 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014270 4795 feature_gate.go:330] 
unrecognized feature gate: InsightsConfigAPI Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014275 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014280 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014285 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014290 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014295 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014299 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014304 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014309 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014315 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014322 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014329 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014335 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014341 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014347 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.014357 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014525 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014532 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014538 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014544 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014549 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014554 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014559 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014564 4795 feature_gate.go:330] 
unrecognized feature gate: NodeDisruptionPolicy Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014569 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014574 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014579 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014585 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014591 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014596 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014601 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014608 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014614 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014620 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014626 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014631 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014636 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014641 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014647 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014652 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014657 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014664 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014670 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014712 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014723 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014730 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014737 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014742 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014748 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014753 4795 feature_gate.go:330] unrecognized feature gate: Example Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014765 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014771 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014777 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014782 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014787 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014792 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014797 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014803 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014808 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014814 4795 
feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014824 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014830 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014836 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014841 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014846 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014851 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014856 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014861 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014866 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014871 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014877 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014882 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014887 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014892 4795 feature_gate.go:330] unrecognized feature 
gate: MultiArchInstallAWS Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014897 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014902 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014907 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014912 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014918 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014923 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014929 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014934 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014939 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014945 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014950 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014956 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.014963 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.014972 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true 
DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.015193 4795 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.020098 4795 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.025085 4795 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.025255 4795 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.027544 4795 server.go:997] "Starting client certificate rotation" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.027584 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.027823 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.055023 4795 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.059015 4795 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.060626 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.081056 4795 log.go:25] "Validated CRI v1 runtime API" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.121435 4795 log.go:25] "Validated CRI v1 image API" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.124110 4795 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.130974 4795 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-17-11-54-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.131012 4795 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.156774 4795 manager.go:217] Machine: {Timestamp:2026-03-20 17:17:37.151879249 +0000 UTC m=+0.609910830 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:14ef5e9e-707f-4ad8-89b5-1abff10c4fa0 BootID:55dab564-f3ba-4083-bf1a-aa261eb80746 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:44:1e:48 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:44:1e:48 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b3:f7:a7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:77:09:e9 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ae:da:c1 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c8:5b:b6 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c6:65:4e:b0:37:b4 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ea:58:7d:53:27:ff Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.157065 4795 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.157303 4795 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.158678 4795 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.158907 4795 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.158952 4795 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.161633 4795 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.161663 4795 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.162122 4795 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.162161 4795 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.162428 4795 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.162534 4795 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.166268 4795 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.166302 4795 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.166328 4795 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.166348 4795 kubelet.go:324] "Adding apiserver pod source"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.166372 4795 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.171407 4795 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.172151 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused
Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.172253 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError"
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.172279 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused
Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.172436 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.172455 4795 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.174987 4795 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176660 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176715 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176730 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176743 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176761 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176773 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176785 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176801 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176813 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176825 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176858 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.176876 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.178628 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.179199 4795 server.go:1280] "Started kubelet"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.179306 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.181312 4795 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.181276 4795 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.183099 4795 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 17:17:37 crc systemd[1]: Started Kubernetes Kubelet.
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.185061 4795 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.185425 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.185518 4795 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.185647 4795 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.185707 4795 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.185673 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.186886 4795 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.187616 4795 factory.go:55] Registering systemd factory
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.187639 4795 factory.go:221] Registration of the systemd container factory successfully
Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.199341 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="200ms"
Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.199478 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.200076 4795 factory.go:153] Registering CRI-O factory
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.200182 4795 factory.go:221] Registration of the crio container factory successfully
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.200280 4795 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.200321 4795 factory.go:103] Registering Raw factory
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.200348 4795 manager.go:1196] Started watching for new ooms in manager
Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.200528 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError"
Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.198847 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e9c35951873db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.179161563 +0000 UTC m=+0.637193124,LastTimestamp:2026-03-20 17:17:37.179161563 +0000 UTC m=+0.637193124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.201350 4795 manager.go:319] Starting recovery of all containers
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.210897 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.210981 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211005 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211020 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211041 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211063 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211086 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211109 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211129 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211145 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211162 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211179 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211195 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211214 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211237 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211253 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211271 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211288 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211304 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211319 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211335 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211352 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211369 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211388 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211405 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211421 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211442 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211464 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211482 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211499 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211530 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211550 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211568 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211624 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.211644 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213222 4795 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213266 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213292 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213314 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213334 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213354 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213376 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213393 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213415 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213432 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213450 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213469 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213487 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213511 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213528 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213545 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213567 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213585 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213616 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213636 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213657 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213676 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213722 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213745 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213779 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213798 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213817 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213838 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213884 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213903 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213920 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213938 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213954 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213973 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.213992 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214011 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214033 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214053 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214071 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214090 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214109 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214128 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214148 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214166 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214186 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214207 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214225 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214248 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69"
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214268 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214286 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214305 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214326 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214345 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214363 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214388 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214406 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214429 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214449 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214468 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214490 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214509 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214527 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214545 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214563 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214621 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214644 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" 
seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214662 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214708 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214730 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214749 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214775 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214794 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 
17:17:37.214816 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214836 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214856 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214877 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214899 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214920 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214942 4795 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214963 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.214985 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215006 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215028 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215050 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215071 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215093 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215115 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215136 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215156 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215176 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215196 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215217 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215236 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215260 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215284 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215305 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215327 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" 
seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215347 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215368 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215388 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215408 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215429 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215450 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 17:17:37 crc 
kubenswrapper[4795]: I0320 17:17:37.215475 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215496 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215515 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215535 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215554 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215576 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215600 4795 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215622 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215646 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215666 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215747 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215772 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215795 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215814 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215835 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215855 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215875 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215895 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215921 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215961 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.215981 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216003 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216029 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216052 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216071 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216092 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216114 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216135 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216156 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216175 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216202 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" 
seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216227 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216247 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216266 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216286 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216308 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216328 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 
17:17:37.216347 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216374 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216396 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216418 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216439 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216460 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216479 4795 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216502 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216524 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216545 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216567 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216588 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216609 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216630 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216648 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216668 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216714 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216739 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216757 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216779 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216800 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216821 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216843 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216862 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216884 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216909 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216930 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216949 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216968 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.216989 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.217010 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" 
seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.217030 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.217052 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.217073 4795 reconstruct.go:97] "Volume reconstruction finished" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.217087 4795 reconciler.go:26] "Reconciler: start to sync state" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.236127 4795 manager.go:324] Recovery completed Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.247799 4795 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.248195 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.250298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.250347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.250356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.250789 4795 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.250834 4795 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.250868 4795 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.250922 4795 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.251752 4795 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.251773 4795 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.251795 4795 state_mem.go:36] "Initialized new in-memory state store" Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.252036 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.252095 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.275498 4795 policy_none.go:49] "None policy: Start" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.276153 4795 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.276180 4795 state_mem.go:35] "Initializing new in-memory state store" Mar 20 17:17:37 crc 
kubenswrapper[4795]: E0320 17:17:37.287450 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.345213 4795 manager.go:334] "Starting Device Plugin manager" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.345279 4795 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.345296 4795 server.go:79] "Starting device plugin registration server" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.345960 4795 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.345986 4795 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.346523 4795 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.346662 4795 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.346674 4795 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.350987 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.351092 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.352509 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.352548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.352569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.352853 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.353439 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.353479 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.353907 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.354590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.354666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.354706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.355371 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.355416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc 
kubenswrapper[4795]: I0320 17:17:37.355427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.355552 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.356055 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.356099 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.360225 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.360300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.360315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.360345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.360362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.360319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.360630 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.360808 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.360873 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.363783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.363823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.363785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.363839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.363859 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.363942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.364226 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.364346 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.364402 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.365464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.365499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.365467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.365510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.365524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.365608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.365820 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.365864 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.366942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.366979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.366992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.401620 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="400ms" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.419949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420022 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420065 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420097 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420122 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420168 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420193 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420543 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420663 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420768 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.420845 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.446545 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.447938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.447999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.448017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.448056 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.448658 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.58:6443: connect: connection 
refused" node="crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522149 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522292 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522468 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522532 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522615 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522655 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522381 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522738 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522751 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522777 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522869 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522892 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.522914 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.649178 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.651115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.651214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.651228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.651277 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.652039 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.58:6443: connect: 
connection refused" node="crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.710344 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.729195 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.741974 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.752084 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-9015642e1ccc851543fa8a86779b259228366f01973c434db0f8238c1465d7d5 WatchSource:0}: Error finding container 9015642e1ccc851543fa8a86779b259228366f01973c434db0f8238c1465d7d5: Status 404 returned error can't find the container with id 9015642e1ccc851543fa8a86779b259228366f01973c434db0f8238c1465d7d5 Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.766390 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-28af6739aeaaf6123c1051459f2a075d3392440c144495789df1642c8c603b1b WatchSource:0}: Error finding container 28af6739aeaaf6123c1051459f2a075d3392440c144495789df1642c8c603b1b: Status 404 returned error can't find the container with id 28af6739aeaaf6123c1051459f2a075d3392440c144495789df1642c8c603b1b Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.775295 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.777130 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7a27d89fc942218a01187cb5716bc79f5329f97a6e7f9d32b7804afa7069f0db WatchSource:0}: Error finding container 7a27d89fc942218a01187cb5716bc79f5329f97a6e7f9d32b7804afa7069f0db: Status 404 returned error can't find the container with id 7a27d89fc942218a01187cb5716bc79f5329f97a6e7f9d32b7804afa7069f0db Mar 20 17:17:37 crc kubenswrapper[4795]: I0320 17:17:37.783167 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.803344 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-ab154851a262797560adc1fc3c6ca2e5f375afa8b1bf666da617e8bcc65a1380 WatchSource:0}: Error finding container ab154851a262797560adc1fc3c6ca2e5f375afa8b1bf666da617e8bcc65a1380: Status 404 returned error can't find the container with id ab154851a262797560adc1fc3c6ca2e5f375afa8b1bf666da617e8bcc65a1380 Mar 20 17:17:37 crc kubenswrapper[4795]: E0320 17:17:37.803474 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="800ms" Mar 20 17:17:37 crc kubenswrapper[4795]: W0320 17:17:37.806337 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c6aac87efe1b16991d5e1177c7efbe84754f64875072a6931be21b36da681b22 
WatchSource:0}: Error finding container c6aac87efe1b16991d5e1177c7efbe84754f64875072a6931be21b36da681b22: Status 404 returned error can't find the container with id c6aac87efe1b16991d5e1177c7efbe84754f64875072a6931be21b36da681b22 Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.052416 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.055055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.055118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.055133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.055172 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:38 crc kubenswrapper[4795]: E0320 17:17:38.055589 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.58:6443: connect: connection refused" node="crc" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.181106 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:38 crc kubenswrapper[4795]: W0320 17:17:38.209195 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:38 crc kubenswrapper[4795]: E0320 17:17:38.209340 
4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.256123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7a27d89fc942218a01187cb5716bc79f5329f97a6e7f9d32b7804afa7069f0db"} Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.257592 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"28af6739aeaaf6123c1051459f2a075d3392440c144495789df1642c8c603b1b"} Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.259034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9015642e1ccc851543fa8a86779b259228366f01973c434db0f8238c1465d7d5"} Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.262030 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c6aac87efe1b16991d5e1177c7efbe84754f64875072a6931be21b36da681b22"} Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.270719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ab154851a262797560adc1fc3c6ca2e5f375afa8b1bf666da617e8bcc65a1380"} Mar 20 17:17:38 crc kubenswrapper[4795]: W0320 
17:17:38.336819 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:38 crc kubenswrapper[4795]: E0320 17:17:38.336929 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4795]: E0320 17:17:38.443518 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e9c35951873db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.179161563 +0000 UTC m=+0.637193124,LastTimestamp:2026-03-20 17:17:37.179161563 +0000 UTC m=+0.637193124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:38 crc kubenswrapper[4795]: W0320 17:17:38.479726 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:38 crc kubenswrapper[4795]: E0320 17:17:38.479915 4795 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4795]: E0320 17:17:38.605290 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="1.6s" Mar 20 17:17:38 crc kubenswrapper[4795]: W0320 17:17:38.684603 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:38 crc kubenswrapper[4795]: E0320 17:17:38.684749 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.855836 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.857899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.857957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.857974 4795 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:38 crc kubenswrapper[4795]: I0320 17:17:38.858040 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:38 crc kubenswrapper[4795]: E0320 17:17:38.858681 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.58:6443: connect: connection refused" node="crc" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.163337 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 17:17:39 crc kubenswrapper[4795]: E0320 17:17:39.165026 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.180352 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.276069 4795 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d" exitCode=0 Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.276180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d"} Mar 20 17:17:39 crc kubenswrapper[4795]: 
I0320 17:17:39.276262 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.278850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.278918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.278942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.281421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b"} Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.281506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76"} Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.281531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950"} Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.284337 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782" exitCode=0 Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.284444 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782"} Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.284738 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.286448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.286490 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.286507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.288537 4795 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317" exitCode=0 Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.288662 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317"} Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.288743 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.290301 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.290481 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 
17:17:39.290529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.290551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.294576 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.294623 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.294858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.296157 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d" exitCode=0 Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.296215 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d"} Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.296294 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.297502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.297542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:39 crc kubenswrapper[4795]: I0320 17:17:39.297560 4795 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:39 crc kubenswrapper[4795]: W0320 17:17:39.977011 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:39 crc kubenswrapper[4795]: E0320 17:17:39.977482 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.180288 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:40 crc kubenswrapper[4795]: E0320 17:17:40.206258 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="3.2s" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.302021 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4"} Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.302140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6"} Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.308334 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.308324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654"} Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.309366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.309414 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.309433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.312124 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018"} Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.312172 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080"} Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.315153 4795 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d" exitCode=0 Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.315231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d"} Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.315289 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.316403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.316441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.316454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.320145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b"} Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.320305 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.321811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.321837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.321848 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.380156 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.440155 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.459412 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.460848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.460889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.460903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4795]: I0320 17:17:40.460929 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:40 crc kubenswrapper[4795]: E0320 17:17:40.461395 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.58:6443: connect: connection refused" node="crc" Mar 20 17:17:40 crc kubenswrapper[4795]: W0320 17:17:40.466959 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:40 crc kubenswrapper[4795]: E0320 17:17:40.467029 4795 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.180598 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:41 crc kubenswrapper[4795]: W0320 17:17:41.197151 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.58:6443: connect: connection refused Mar 20 17:17:41 crc kubenswrapper[4795]: E0320 17:17:41.197225 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.58:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.327129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2ef025b52e751ef7d3dae0b76a4e9a8714a60988166187d111c22c8e48b09131"} Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.327204 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96"} Mar 20 17:17:41 crc 
kubenswrapper[4795]: I0320 17:17:41.327226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94"} Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.327401 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.328499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.328544 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.328562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.331559 4795 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d" exitCode=0 Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.331633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d"} Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.331805 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.332752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.332788 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.332804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.336928 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3"} Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.337020 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.337399 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.337023 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.338310 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.338352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.338372 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.338897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.339076 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.339193 4795 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.340135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.340288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4795]: I0320 17:17:41.340509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.343274 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5"} Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.343325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6"} Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.343337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6"} Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.343293 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.343373 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.343416 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 
17:17:42.343917 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.343980 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.344426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.344438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.344457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.344460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.344467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.344473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.344961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.345002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.345013 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.955475 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:42 crc kubenswrapper[4795]: I0320 17:17:42.971759 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.353167 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca"} Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.353225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895"} Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.353267 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.353315 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.353489 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.355124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.355150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.355176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.355178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.355200 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.355208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.356333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.356382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.356402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.378850 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.440622 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.440758 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.662002 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.664010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.664067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.664087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.664124 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.914599 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.914915 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.916529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.916636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4795]: I0320 17:17:43.916658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4795]: I0320 17:17:44.356073 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:44 crc kubenswrapper[4795]: I0320 17:17:44.357778 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4795]: I0320 17:17:44.357857 4795 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4795]: I0320 17:17:44.357882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4795]: I0320 17:17:44.401512 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:44 crc kubenswrapper[4795]: I0320 17:17:44.401781 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:44 crc kubenswrapper[4795]: I0320 17:17:44.403534 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4795]: I0320 17:17:44.403606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4795]: I0320 17:17:44.403619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4795]: I0320 17:17:45.928521 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:45 crc kubenswrapper[4795]: I0320 17:17:45.928721 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:45 crc kubenswrapper[4795]: I0320 17:17:45.930534 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4795]: I0320 17:17:45.930582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4795]: I0320 17:17:45.930597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc 
kubenswrapper[4795]: I0320 17:17:46.295268 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 17:17:46 crc kubenswrapper[4795]: I0320 17:17:46.295506 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:46 crc kubenswrapper[4795]: I0320 17:17:46.297378 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4795]: I0320 17:17:46.297439 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4795]: I0320 17:17:46.297463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc kubenswrapper[4795]: I0320 17:17:46.722312 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 17:17:46 crc kubenswrapper[4795]: I0320 17:17:46.722603 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:46 crc kubenswrapper[4795]: I0320 17:17:46.724306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4795]: I0320 17:17:46.724392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4795]: I0320 17:17:46.724410 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4795]: E0320 17:17:47.354076 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:17:48 crc kubenswrapper[4795]: I0320 17:17:48.386730 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:48 crc kubenswrapper[4795]: I0320 17:17:48.387017 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:48 crc kubenswrapper[4795]: I0320 17:17:48.390534 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4795]: I0320 17:17:48.390625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4795]: I0320 17:17:48.390654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4795]: I0320 17:17:48.396850 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:49 crc kubenswrapper[4795]: I0320 17:17:49.369098 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:49 crc kubenswrapper[4795]: I0320 17:17:49.370847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:49 crc kubenswrapper[4795]: I0320 17:17:49.370963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:49 crc kubenswrapper[4795]: I0320 17:17:49.370984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:49 crc kubenswrapper[4795]: I0320 17:17:49.376888 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:50 crc kubenswrapper[4795]: I0320 17:17:50.372045 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 20 17:17:50 crc kubenswrapper[4795]: I0320 17:17:50.373440 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4795]: I0320 17:17:50.373502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4795]: I0320 17:17:50.373524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4795]: W0320 17:17:51.756552 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 17:17:51 crc kubenswrapper[4795]: I0320 17:17:51.757319 4795 trace.go:236] Trace[1085729732]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 17:17:41.754) (total time: 10002ms): Mar 20 17:17:51 crc kubenswrapper[4795]: Trace[1085729732]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:17:51.756) Mar 20 17:17:51 crc kubenswrapper[4795]: Trace[1085729732]: [10.002415149s] [10.002415149s] END Mar 20 17:17:51 crc kubenswrapper[4795]: E0320 17:17:51.757361 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.074754 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z Mar 20 17:17:52 crc kubenswrapper[4795]: W0320 17:17:52.079438 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z Mar 20 17:17:52 crc kubenswrapper[4795]: E0320 17:17:52.079505 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:17:52 crc kubenswrapper[4795]: E0320 17:17:52.081038 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 17:17:52 crc kubenswrapper[4795]: E0320 17:17:52.082654 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:17:52 crc kubenswrapper[4795]: W0320 17:17:52.083085 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z Mar 20 17:17:52 crc kubenswrapper[4795]: E0320 17:17:52.083166 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:17:52 crc kubenswrapper[4795]: E0320 17:17:52.084545 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 17:17:52 crc kubenswrapper[4795]: W0320 17:17:52.087902 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z Mar 20 17:17:52 crc kubenswrapper[4795]: E0320 17:17:52.087949 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed 
to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.088574 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.088642 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 17:17:52 crc kubenswrapper[4795]: E0320 17:17:52.089572 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e9c35951873db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.179161563 +0000 UTC m=+0.637193124,LastTimestamp:2026-03-20 17:17:37.179161563 +0000 UTC m=+0.637193124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.093189 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.093249 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.172295 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.172371 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.182116 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:52Z is after 
2026-02-23T05:33:13Z Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.381494 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.384295 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2ef025b52e751ef7d3dae0b76a4e9a8714a60988166187d111c22c8e48b09131" exitCode=255 Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.384351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2ef025b52e751ef7d3dae0b76a4e9a8714a60988166187d111c22c8e48b09131"} Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.384550 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.385752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.385819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.385839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:52 crc kubenswrapper[4795]: I0320 17:17:52.386783 4795 scope.go:117] "RemoveContainer" containerID="2ef025b52e751ef7d3dae0b76a4e9a8714a60988166187d111c22c8e48b09131" Mar 20 17:17:53 crc kubenswrapper[4795]: I0320 17:17:53.183821 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:17:53Z is after 2026-02-23T05:33:13Z Mar 20 17:17:53 crc kubenswrapper[4795]: I0320 17:17:53.389070 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 17:17:53 crc kubenswrapper[4795]: I0320 17:17:53.391047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9"} Mar 20 17:17:53 crc kubenswrapper[4795]: I0320 17:17:53.391197 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:53 crc kubenswrapper[4795]: I0320 17:17:53.395596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4795]: I0320 17:17:53.395672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4795]: I0320 17:17:53.395706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4795]: I0320 17:17:53.441062 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" start-of-body= Mar 20 17:17:53 crc kubenswrapper[4795]: I0320 17:17:53.441150 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded" Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.184849 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:54Z is after 2026-02-23T05:33:13Z Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.397802 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.398652 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.401479 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9" exitCode=255 Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.401554 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9"} Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.401664 4795 scope.go:117] "RemoveContainer" containerID="2ef025b52e751ef7d3dae0b76a4e9a8714a60988166187d111c22c8e48b09131" Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.401890 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.403278 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.403331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.403349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:54 crc kubenswrapper[4795]: I0320 17:17:54.404267 4795 scope.go:117] "RemoveContainer" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9" Mar 20 17:17:54 crc kubenswrapper[4795]: E0320 17:17:54.404569 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:17:55 crc kubenswrapper[4795]: I0320 17:17:55.184853 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:55Z is after 2026-02-23T05:33:13Z Mar 20 17:17:55 crc kubenswrapper[4795]: I0320 17:17:55.406847 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 17:17:55 crc kubenswrapper[4795]: I0320 17:17:55.947484 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:55 crc kubenswrapper[4795]: I0320 17:17:55.947756 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 20 17:17:55 crc kubenswrapper[4795]: I0320 17:17:55.949388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4795]: I0320 17:17:55.949460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4795]: I0320 17:17:55.949488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4795]: I0320 17:17:55.950416 4795 scope.go:117] "RemoveContainer" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9" Mar 20 17:17:55 crc kubenswrapper[4795]: E0320 17:17:55.950901 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:17:55 crc kubenswrapper[4795]: I0320 17:17:55.955070 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.185759 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:56Z is after 2026-02-23T05:33:13Z Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.337591 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 
17:17:56.337871 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.339425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.339478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.339498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.358941 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.413189 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.413301 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.414717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.414777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.414803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.414853 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.414885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 
17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.414895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4795]: I0320 17:17:56.415681 4795 scope.go:117] "RemoveContainer" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9" Mar 20 17:17:56 crc kubenswrapper[4795]: E0320 17:17:56.416005 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:17:56 crc kubenswrapper[4795]: W0320 17:17:56.742649 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:56Z is after 2026-02-23T05:33:13Z Mar 20 17:17:56 crc kubenswrapper[4795]: E0320 17:17:56.742780 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:17:57 crc kubenswrapper[4795]: I0320 17:17:57.184751 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:57Z is after 2026-02-23T05:33:13Z Mar 20 17:17:57 crc kubenswrapper[4795]: E0320 17:17:57.354259 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:17:58 crc kubenswrapper[4795]: I0320 17:17:58.185719 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:58 crc kubenswrapper[4795]: I0320 17:17:58.485336 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:58 crc kubenswrapper[4795]: I0320 17:17:58.487167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4795]: I0320 17:17:58.487232 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4795]: I0320 17:17:58.487258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4795]: I0320 17:17:58.487300 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:58 crc kubenswrapper[4795]: E0320 17:17:58.491253 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:17:58 crc kubenswrapper[4795]: E0320 17:17:58.496153 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group 
\"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:17:58 crc kubenswrapper[4795]: W0320 17:17:58.629638 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:58 crc kubenswrapper[4795]: E0320 17:17:58.629678 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 17:17:59 crc kubenswrapper[4795]: I0320 17:17:59.187382 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:00 crc kubenswrapper[4795]: I0320 17:18:00.190118 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:00 crc kubenswrapper[4795]: W0320 17:18:00.417440 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 17:18:00 crc kubenswrapper[4795]: E0320 17:18:00.417517 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" 
logger="UnhandledError" Mar 20 17:18:00 crc kubenswrapper[4795]: I0320 17:18:00.764122 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 17:18:00 crc kubenswrapper[4795]: I0320 17:18:00.789967 4795 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 17:18:01 crc kubenswrapper[4795]: I0320 17:18:01.185304 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:01 crc kubenswrapper[4795]: W0320 17:18:01.748397 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 17:18:01 crc kubenswrapper[4795]: E0320 17:18:01.748465 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.097421 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c35951873db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
17:17:37.179161563 +0000 UTC m=+0.637193124,LastTimestamp:2026-03-20 17:17:37.179161563 +0000 UTC m=+0.637193124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.104580 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c3599567c51 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250335825 +0000 UTC m=+0.708367366,LastTimestamp:2026-03-20 17:17:37.250335825 +0000 UTC m=+0.708367366,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.111320 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956c053 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250353235 +0000 UTC m=+0.708384776,LastTimestamp:2026-03-20 17:17:37.250353235 +0000 UTC m=+0.708384776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.118299 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956e32b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250362155 +0000 UTC m=+0.708393696,LastTimestamp:2026-03-20 17:17:37.250362155 +0000 UTC m=+0.708393696,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.124838 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359f48a00c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.350090764 +0000 UTC m=+0.808122315,LastTimestamp:2026-03-20 17:17:37.350090764 +0000 UTC m=+0.808122315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.132792 4795 event.go:359] "Server rejected event (will not retry!)" 
err="events \"crc.189e9c3599567c51\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c3599567c51 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250335825 +0000 UTC m=+0.708367366,LastTimestamp:2026-03-20 17:17:37.352532471 +0000 UTC m=+0.810564022,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.138914 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956c053\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956c053 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250353235 +0000 UTC m=+0.708384776,LastTimestamp:2026-03-20 17:17:37.352556631 +0000 UTC m=+0.810588192,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.146129 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956e32b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189e9c359956e32b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250362155 +0000 UTC m=+0.708393696,LastTimestamp:2026-03-20 17:17:37.352577561 +0000 UTC m=+0.810609112,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.153069 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c3599567c51\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c3599567c51 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250335825 +0000 UTC m=+0.708367366,LastTimestamp:2026-03-20 17:17:37.35463137 +0000 UTC m=+0.812662921,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.160345 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956c053\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956c053 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250353235 +0000 UTC m=+0.708384776,LastTimestamp:2026-03-20 17:17:37.354676511 +0000 UTC m=+0.812708062,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.167112 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956e32b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956e32b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250362155 +0000 UTC m=+0.708393696,LastTimestamp:2026-03-20 17:17:37.354712912 +0000 UTC m=+0.812744463,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.171311 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.171533 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.173252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc 
kubenswrapper[4795]: I0320 17:18:02.173436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.173569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.174806 4795 scope.go:117] "RemoveContainer" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.174797 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c3599567c51\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c3599567c51 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250335825 +0000 UTC m=+0.708367366,LastTimestamp:2026-03-20 17:17:37.355393514 +0000 UTC m=+0.813425055,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.175441 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.180089 4795 event.go:359] "Server rejected event (will 
not retry!)" err="events \"crc.189e9c359956c053\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956c053 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250353235 +0000 UTC m=+0.708384776,LastTimestamp:2026-03-20 17:17:37.355423614 +0000 UTC m=+0.813455155,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.185752 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.186239 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956e32b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956e32b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250362155 +0000 UTC m=+0.708393696,LastTimestamp:2026-03-20 17:17:37.355433675 +0000 UTC m=+0.813465206,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.190997 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c3599567c51\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c3599567c51 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250335825 +0000 UTC m=+0.708367366,LastTimestamp:2026-03-20 17:17:37.360259266 +0000 UTC m=+0.818290817,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.194962 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956c053\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956c053 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250353235 +0000 UTC m=+0.708384776,LastTimestamp:2026-03-20 17:17:37.360311497 +0000 UTC m=+0.818343048,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.200400 4795 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c3599567c51\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c3599567c51 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250335825 +0000 UTC m=+0.708367366,LastTimestamp:2026-03-20 17:17:37.360335898 +0000 UTC m=+0.818367449,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.204835 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956c053\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956c053 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250353235 +0000 UTC m=+0.708384776,LastTimestamp:2026-03-20 17:17:37.360354148 +0000 UTC m=+0.818385699,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.208154 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956e32b\" is forbidden: User \"system:anonymous\" cannot patch resource 
\"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956e32b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250362155 +0000 UTC m=+0.708393696,LastTimestamp:2026-03-20 17:17:37.360392269 +0000 UTC m=+0.818423820,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.213419 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956e32b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956e32b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250362155 +0000 UTC m=+0.708393696,LastTimestamp:2026-03-20 17:17:37.360417079 +0000 UTC m=+0.818448630,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.220369 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c3599567c51\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c3599567c51 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250335825 +0000 UTC m=+0.708367366,LastTimestamp:2026-03-20 17:17:37.363814643 +0000 UTC m=+0.821846194,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.224685 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956c053\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956c053 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250353235 +0000 UTC m=+0.708384776,LastTimestamp:2026-03-20 17:17:37.363832943 +0000 UTC m=+0.821864504,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.231332 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c3599567c51\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c3599567c51 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc 
status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250335825 +0000 UTC m=+0.708367366,LastTimestamp:2026-03-20 17:17:37.363849633 +0000 UTC m=+0.821881184,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.238261 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956e32b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956e32b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250362155 +0000 UTC m=+0.708393696,LastTimestamp:2026-03-20 17:17:37.363868334 +0000 UTC m=+0.821899895,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.241471 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c359956c053\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c359956c053 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.250353235 +0000 UTC 
m=+0.708384776,LastTimestamp:2026-03-20 17:17:37.363926045 +0000 UTC m=+0.821957596,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.245058 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c35b80c6447 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.765573703 +0000 UTC m=+1.223605284,LastTimestamp:2026-03-20 17:17:37.765573703 +0000 UTC m=+1.223605284,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.249310 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c35b84b7bac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.76970846 +0000 UTC m=+1.227740041,LastTimestamp:2026-03-20 17:17:37.76970846 +0000 UTC m=+1.227740041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.252558 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c35b9326702 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.784841986 +0000 UTC m=+1.242873527,LastTimestamp:2026-03-20 17:17:37.784841986 +0000 UTC m=+1.242873527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.255799 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35ba7a3b66 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.80632663 +0000 UTC m=+1.264358171,LastTimestamp:2026-03-20 17:17:37.80632663 +0000 UTC m=+1.264358171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.259356 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c35bab4c563 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:37.810163043 +0000 UTC 
m=+1.268194584,LastTimestamp:2026-03-20 17:17:37.810163043 +0000 UTC m=+1.268194584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.262199 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c35dc521542 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.37412077 +0000 UTC m=+1.832152321,LastTimestamp:2026-03-20 17:17:38.37412077 +0000 UTC m=+1.832152321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.263961 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c35dc603e53 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.375048787 +0000 UTC m=+1.833080338,LastTimestamp:2026-03-20 17:17:38.375048787 +0000 UTC m=+1.833080338,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.267467 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35dc62ae5c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.37520854 +0000 UTC m=+1.833240091,LastTimestamp:2026-03-20 17:17:38.37520854 +0000 UTC m=+1.833240091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.271491 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c35dc63733d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.375258941 +0000 UTC m=+1.833290482,LastTimestamp:2026-03-20 17:17:38.375258941 +0000 UTC m=+1.833290482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.273614 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c35dca5bd53 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.379603283 +0000 UTC m=+1.837634834,LastTimestamp:2026-03-20 17:17:38.379603283 +0000 UTC m=+1.837634834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.277465 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35dd7bb9bb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.393627067 +0000 UTC m=+1.851658618,LastTimestamp:2026-03-20 17:17:38.393627067 +0000 UTC m=+1.851658618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.284826 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c35dd92f170 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.395148656 +0000 UTC m=+1.853180207,LastTimestamp:2026-03-20 17:17:38.395148656 +0000 UTC m=+1.853180207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.288763 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c35dd9347f6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.395170806 +0000 UTC m=+1.853202357,LastTimestamp:2026-03-20 17:17:38.395170806 +0000 UTC m=+1.853202357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.295304 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c35dd948cee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.395253998 +0000 UTC m=+1.853285559,LastTimestamp:2026-03-20 17:17:38.395253998 +0000 UTC m=+1.853285559,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.299994 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35dd9dceef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.395860719 +0000 UTC m=+1.853892260,LastTimestamp:2026-03-20 17:17:38.395860719 +0000 UTC m=+1.853892260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.303596 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c35dd9ed28d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.395927181 +0000 UTC m=+1.853958732,LastTimestamp:2026-03-20 17:17:38.395927181 +0000 UTC m=+1.853958732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: 
E0320 17:18:02.308924 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35f05817a1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.710058913 +0000 UTC m=+2.168090494,LastTimestamp:2026-03-20 17:17:38.710058913 +0000 UTC m=+2.168090494,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.315031 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35f1331641 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.724410945 +0000 UTC m=+2.182442516,LastTimestamp:2026-03-20 17:17:38.724410945 +0000 UTC 
m=+2.182442516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.321790 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35f1577c3e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.72679635 +0000 UTC m=+2.184827931,LastTimestamp:2026-03-20 17:17:38.72679635 +0000 UTC m=+2.184827931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.328579 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3602a327cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.01696814 +0000 UTC m=+2.474999731,LastTimestamp:2026-03-20 17:17:39.01696814 +0000 UTC m=+2.474999731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.334732 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3603acb1c0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.034370496 +0000 UTC m=+2.492402057,LastTimestamp:2026-03-20 17:17:39.034370496 +0000 UTC m=+2.492402057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.341281 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3603d178f6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.03678079 +0000 UTC m=+2.494812361,LastTimestamp:2026-03-20 17:17:39.03678079 +0000 UTC m=+2.494812361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.347666 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c361274a270 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.2823548 +0000 UTC m=+2.740386381,LastTimestamp:2026-03-20 17:17:39.2823548 
+0000 UTC m=+2.740386381,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.355352 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3612ea32eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.290059499 +0000 UTC m=+2.748091080,LastTimestamp:2026-03-20 17:17:39.290059499 +0000 UTC m=+2.748091080,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.360388 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c3613691f37 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.298377527 +0000 UTC m=+2.756409098,LastTimestamp:2026-03-20 17:17:39.298377527 +0000 UTC m=+2.756409098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.367100 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c361375fe7f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.299221119 +0000 UTC m=+2.757252690,LastTimestamp:2026-03-20 17:17:39.299221119 +0000 UTC m=+2.757252690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.375345 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3615417ea0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.329334944 +0000 UTC m=+2.787366515,LastTimestamp:2026-03-20 17:17:39.329334944 +0000 UTC m=+2.787366515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.382913 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3617ce7aa1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.372128929 +0000 UTC m=+2.830160510,LastTimestamp:2026-03-20 17:17:39.372128929 +0000 UTC m=+2.830160510,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 
17:18:02.390562 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c3623508d50 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.565202768 +0000 UTC m=+3.023234349,LastTimestamp:2026-03-20 17:17:39.565202768 +0000 UTC m=+3.023234349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.396499 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c3627133156 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.62829039 +0000 UTC m=+3.086321941,LastTimestamp:2026-03-20 17:17:39.62829039 +0000 UTC m=+3.086321941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.400725 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c3627229e73 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.629301363 +0000 UTC m=+3.087332914,LastTimestamp:2026-03-20 17:17:39.629301363 +0000 UTC m=+3.087332914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.406239 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c36272849e1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
17:17:39.629672929 +0000 UTC m=+3.087704490,LastTimestamp:2026-03-20 17:17:39.629672929 +0000 UTC m=+3.087704490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.410914 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3627dac26a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.641369194 +0000 UTC m=+3.099400745,LastTimestamp:2026-03-20 17:17:39.641369194 +0000 UTC m=+3.099400745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.417178 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c3627ea9e3d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.642408509 +0000 UTC m=+3.100440050,LastTimestamp:2026-03-20 17:17:39.642408509 +0000 UTC m=+3.100440050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.421919 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c3629f10041 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.676381249 +0000 UTC m=+3.134412820,LastTimestamp:2026-03-20 17:17:39.676381249 +0000 UTC m=+3.134412820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.425999 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c362a0f6ead openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.678375597 +0000 UTC m=+3.136407178,LastTimestamp:2026-03-20 17:17:39.678375597 +0000 UTC m=+3.136407178,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.431549 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c362a1568a9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.678767273 +0000 UTC m=+3.136798844,LastTimestamp:2026-03-20 17:17:39.678767273 +0000 UTC m=+3.136798844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.435903 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c362a34835a openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:39.680805722 +0000 UTC m=+3.138837293,LastTimestamp:2026-03-20 17:17:39.680805722 +0000 UTC m=+3.138837293,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.441342 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3644ece34f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.129096527 +0000 UTC m=+3.587128098,LastTimestamp:2026-03-20 17:17:40.129096527 +0000 UTC m=+3.587128098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.445819 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c3644f82057 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.129833047 +0000 UTC m=+3.587864618,LastTimestamp:2026-03-20 17:17:40.129833047 +0000 UTC m=+3.587864618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.452073 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c3648724d2a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.188171562 +0000 UTC m=+3.646203133,LastTimestamp:2026-03-20 17:17:40.188171562 +0000 UTC m=+3.646203133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc 
kubenswrapper[4795]: E0320 17:18:02.456198 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c36488cca96 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.189907606 +0000 UTC m=+3.647939187,LastTimestamp:2026-03-20 17:17:40.189907606 +0000 UTC m=+3.647939187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.460221 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c36496241ec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.203897324 +0000 UTC 
m=+3.661928905,LastTimestamp:2026-03-20 17:17:40.203897324 +0000 UTC m=+3.661928905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.464194 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3649971ed9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.207361753 +0000 UTC m=+3.665393334,LastTimestamp:2026-03-20 17:17:40.207361753 +0000 UTC m=+3.665393334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.468619 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c3650360f9c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.318441372 +0000 UTC m=+3.776472923,LastTimestamp:2026-03-20 17:17:40.318441372 +0000 UTC m=+3.776472923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.474712 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c3658bf95bb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.461671867 +0000 UTC m=+3.919703418,LastTimestamp:2026-03-20 17:17:40.461671867 +0000 UTC m=+3.919703418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.478917 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3658eb1303 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.464521987 +0000 UTC m=+3.922553538,LastTimestamp:2026-03-20 17:17:40.464521987 +0000 UTC m=+3.922553538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.482920 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c365a27075c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.48522838 +0000 UTC m=+3.943259931,LastTimestamp:2026-03-20 17:17:40.48522838 +0000 UTC m=+3.943259931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 
17:18:02.487535 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c365a448379 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.487160697 +0000 UTC m=+3.945192258,LastTimestamp:2026-03-20 17:17:40.487160697 +0000 UTC m=+3.945192258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.491554 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c365a4c1078 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.487655544 +0000 UTC 
m=+3.945687085,LastTimestamp:2026-03-20 17:17:40.487655544 +0000 UTC m=+3.945687085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.497306 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c365faee820 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.57801936 +0000 UTC m=+4.036050901,LastTimestamp:2026-03-20 17:17:40.57801936 +0000 UTC m=+4.036050901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.502100 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c3660acd2ba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.594660026 +0000 UTC m=+4.052691557,LastTimestamp:2026-03-20 
17:17:40.594660026 +0000 UTC m=+4.052691557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.506733 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c366905fcf1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.734721265 +0000 UTC m=+4.192752806,LastTimestamp:2026-03-20 17:17:40.734721265 +0000 UTC m=+4.192752806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.511442 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3669baaf05 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.746563333 +0000 UTC m=+4.204594874,LastTimestamp:2026-03-20 17:17:40.746563333 +0000 UTC m=+4.204594874,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.515735 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3669c7f9e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.747434465 +0000 UTC m=+4.205466006,LastTimestamp:2026-03-20 17:17:40.747434465 +0000 UTC m=+4.205466006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.520012 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3673e843b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.917322676 +0000 UTC m=+4.375354237,LastTimestamp:2026-03-20 17:17:40.917322676 +0000 UTC m=+4.375354237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.525398 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3674813f59 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.927348569 +0000 UTC m=+4.385380140,LastTimestamp:2026-03-20 17:17:40.927348569 +0000 UTC m=+4.385380140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.531080 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c368cc26903 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:41.334272259 +0000 UTC m=+4.792303830,LastTimestamp:2026-03-20 17:17:41.334272259 +0000 UTC m=+4.792303830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.536725 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c369a07755f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:41.556901215 +0000 UTC m=+5.014932796,LastTimestamp:2026-03-20 17:17:41.556901215 +0000 UTC m=+5.014932796,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.541513 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e9c369ac41e2f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:41.569265199 +0000 UTC m=+5.027296750,LastTimestamp:2026-03-20 17:17:41.569265199 +0000 UTC m=+5.027296750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.545946 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c369ad7602e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:41.570527278 +0000 UTC m=+5.028558829,LastTimestamp:2026-03-20 17:17:41.570527278 +0000 UTC m=+5.028558829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.552025 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c36acd1878e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:41.87213403 +0000 UTC m=+5.330165571,LastTimestamp:2026-03-20 17:17:41.87213403 +0000 UTC m=+5.330165571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.554148 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c36add1113d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:41.888880957 +0000 UTC m=+5.346912498,LastTimestamp:2026-03-20 17:17:41.888880957 +0000 UTC m=+5.346912498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.558002 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c36ade45f07 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:41.890146055 +0000 UTC m=+5.348177626,LastTimestamp:2026-03-20 17:17:41.890146055 +0000 UTC m=+5.348177626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.563266 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c36bbc39bb4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:42.122879924 +0000 UTC m=+5.580911505,LastTimestamp:2026-03-20 17:17:42.122879924 +0000 UTC m=+5.580911505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.568042 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e9c36bc8f3ae2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:42.136224482 +0000 UTC m=+5.594256063,LastTimestamp:2026-03-20 17:17:42.136224482 +0000 UTC m=+5.594256063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.573378 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c36bca2873a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:42.13748921 +0000 UTC m=+5.595520791,LastTimestamp:2026-03-20 17:17:42.13748921 +0000 UTC m=+5.595520791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.580154 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c36cc0d30c1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:42.396137665 +0000 UTC m=+5.854169216,LastTimestamp:2026-03-20 17:17:42.396137665 +0000 UTC m=+5.854169216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.585063 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c36ccdf4b88 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:42.40990708 +0000 UTC m=+5.867938631,LastTimestamp:2026-03-20 17:17:42.40990708 +0000 UTC m=+5.867938631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.589440 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e9c36ccf47cba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:42.41129593 +0000 UTC m=+5.869327471,LastTimestamp:2026-03-20 17:17:42.41129593 +0000 UTC m=+5.869327471,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.593709 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c36db351c93 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:42.650412179 +0000 UTC m=+6.108443750,LastTimestamp:2026-03-20 17:17:42.650412179 +0000 UTC m=+6.108443750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.598037 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c36dc27b46b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:42.666310763 +0000 UTC m=+6.124342344,LastTimestamp:2026-03-20 17:17:42.666310763 +0000 UTC m=+6.124342344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.602914 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:18:02 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c370a503f52 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 17:18:02 crc kubenswrapper[4795]: body: Mar 20 17:18:02 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:43.440719698 +0000 UTC m=+6.898751279,LastTimestamp:2026-03-20 17:17:43.440719698 +0000 UTC m=+6.898751279,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:02 crc kubenswrapper[4795]: > Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.607802 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c370a518bb6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:43.44080479 +0000 UTC m=+6.898836361,LastTimestamp:2026-03-20 17:17:43.44080479 +0000 UTC m=+6.898836361,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.616633 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 17:18:02 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-apiserver-crc.189e9c390dc4ca46 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 17:18:02 crc kubenswrapper[4795]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 17:18:02 crc kubenswrapper[4795]: Mar 20 17:18:02 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.088623686 +0000 UTC m=+15.546655227,LastTimestamp:2026-03-20 17:17:52.088623686 +0000 UTC m=+15.546655227,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:02 crc kubenswrapper[4795]: > Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.621921 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c390dc568fd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.088664317 +0000 UTC m=+15.546695858,LastTimestamp:2026-03-20 17:17:52.088664317 +0000 UTC m=+15.546695858,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.627432 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c390dc4ca46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 17:18:02 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-apiserver-crc.189e9c390dc4ca46 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 17:18:02 crc kubenswrapper[4795]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 17:18:02 crc kubenswrapper[4795]: Mar 20 17:18:02 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.088623686 +0000 UTC m=+15.546655227,LastTimestamp:2026-03-20 17:17:52.093232092 +0000 UTC m=+15.551263643,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:02 crc kubenswrapper[4795]: > Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.632281 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c390dc568fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c390dc568fd openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.088664317 +0000 UTC m=+15.546695858,LastTimestamp:2026-03-20 17:17:52.093280544 +0000 UTC m=+15.551312105,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.637552 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 17:18:02 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-apiserver-crc.189e9c3912c2512d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 20 17:18:02 crc kubenswrapper[4795]: body: Mar 20 17:18:02 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.172347693 +0000 UTC m=+15.630379244,LastTimestamp:2026-03-20 17:17:52.172347693 +0000 UTC m=+15.630379244,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:02 crc kubenswrapper[4795]: > Mar 20 17:18:02 crc 
kubenswrapper[4795]: E0320 17:18:02.642766 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3912c30bc9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.172395465 +0000 UTC m=+15.630427026,LastTimestamp:2026-03-20 17:17:52.172395465 +0000 UTC m=+15.630427026,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.648586 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c3669c7f9e1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3669c7f9e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.747434465 +0000 UTC m=+4.205466006,LastTimestamp:2026-03-20 17:17:52.388756377 +0000 UTC m=+15.846787908,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.655179 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:18:02 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c395e623ebd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded Mar 20 17:18:02 crc kubenswrapper[4795]: body: Mar 20 17:18:02 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:53.441119933 +0000 UTC m=+16.899151474,LastTimestamp:2026-03-20 17:17:53.441119933 +0000 UTC m=+16.899151474,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:02 crc kubenswrapper[4795]: > Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.660679 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c395e6341a7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:53.441186215 +0000 UTC m=+16.899217756,LastTimestamp:2026-03-20 17:17:53.441186215 +0000 UTC m=+16.899217756,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.972107 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.972356 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.974034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.974096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.974114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.975090 4795 scope.go:117] "RemoveContainer" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.975400 4795 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.189260 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.440868 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.440966 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.441035 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.441221 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.443672 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.443793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.443819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.445289 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.445634 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76" gracePeriod=30 Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.450013 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:18:03 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb26b5046 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 17:18:03 crc kubenswrapper[4795]: body: Mar 20 17:18:03 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.440934982 +0000 UTC m=+26.898966563,LastTimestamp:2026-03-20 17:18:03.440934982 +0000 UTC m=+26.898966563,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:03 crc kubenswrapper[4795]: > Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.455890 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb26c53bc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.441001404 +0000 UTC m=+26.899032995,LastTimestamp:2026-03-20 17:18:03.441001404 +0000 UTC 
m=+26.899032995,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.463415 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb2b27973 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.445598579 +0000 UTC m=+26.903630130,LastTimestamp:2026-03-20 17:18:03.445598579 +0000 UTC m=+26.903630130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.581396 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c35dd9dceef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35dd9dceef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.395860719 +0000 UTC m=+1.853892260,LastTimestamp:2026-03-20 17:18:03.576304096 +0000 UTC m=+27.034335647,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.815807 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c35f05817a1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35f05817a1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.710058913 +0000 UTC m=+2.168090494,LastTimestamp:2026-03-20 17:18:03.808653271 +0000 UTC m=+27.266684822,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.828309 4795 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189e9c35f1331641\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35f1331641 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.724410945 +0000 UTC m=+2.182442516,LastTimestamp:2026-03-20 17:18:03.821507722 +0000 UTC m=+27.279539303,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.188708 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.439448 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.439954 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76" exitCode=255 Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.440000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76"} Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.440026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7"} Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.440117 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.441262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.441327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.441344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.189642 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.492084 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.493573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.493636 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.493658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.493725 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:18:05 crc kubenswrapper[4795]: E0320 17:18:05.501482 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:18:05 crc kubenswrapper[4795]: E0320 17:18:05.501809 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:18:06 crc kubenswrapper[4795]: I0320 17:18:06.187288 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:07 crc kubenswrapper[4795]: I0320 17:18:07.187804 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:07 crc kubenswrapper[4795]: E0320 17:18:07.354435 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:18:08 crc kubenswrapper[4795]: I0320 17:18:08.184770 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:09 crc kubenswrapper[4795]: I0320 17:18:09.187531 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:09 crc kubenswrapper[4795]: W0320 17:18:09.213066 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 17:18:09 crc kubenswrapper[4795]: E0320 17:18:09.213151 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.187228 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.381180 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.381412 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.383011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.383113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.383135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.440251 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.456574 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.458303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.458501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.458640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:11 crc kubenswrapper[4795]: I0320 17:18:11.187780 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:11 crc kubenswrapper[4795]: W0320 17:18:11.845101 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:11 crc kubenswrapper[4795]: E0320 17:18:11.845182 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource 
\"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.187037 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.501927 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.503349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.503402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.503422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.503454 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:18:12 crc kubenswrapper[4795]: E0320 17:18:12.509504 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:18:12 crc kubenswrapper[4795]: E0320 17:18:12.509538 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:18:13 crc kubenswrapper[4795]: I0320 17:18:13.186840 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" 
is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:13 crc kubenswrapper[4795]: I0320 17:18:13.440895 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:18:13 crc kubenswrapper[4795]: I0320 17:18:13.440993 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:18:13 crc kubenswrapper[4795]: E0320 17:18:13.448108 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c3bb26b5046\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:18:13 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb26b5046 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 17:18:13 crc kubenswrapper[4795]: 
body: Mar 20 17:18:13 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.440934982 +0000 UTC m=+26.898966563,LastTimestamp:2026-03-20 17:18:13.440968726 +0000 UTC m=+36.899000307,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:13 crc kubenswrapper[4795]: > Mar 20 17:18:13 crc kubenswrapper[4795]: E0320 17:18:13.454984 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c3bb26c53bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb26c53bc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.441001404 +0000 UTC m=+26.899032995,LastTimestamp:2026-03-20 17:18:13.441029047 +0000 UTC m=+36.899060618,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:14 crc kubenswrapper[4795]: I0320 17:18:14.187548 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 
17:18:15 crc kubenswrapper[4795]: I0320 17:18:15.187245 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.187991 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.252104 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.254033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.254113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.254136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.255156 4795 scope.go:117] "RemoveContainer" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9"
Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.188964 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:17 crc kubenswrapper[4795]: E0320 17:18:17.354762 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.479727 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.482463 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db"}
Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.482641 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.483906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.483967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.483992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.184154 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.488569 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.489422 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.493199 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db" exitCode=255
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.493276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db"}
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.493353 4795 scope.go:117] "RemoveContainer" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9"
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.493564 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.495129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.495184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.495219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.496142 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db"
Mar 20 17:18:18 crc kubenswrapper[4795]: E0320 17:18:18.496492 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.186804 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.498602 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.510527 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.512214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.512268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.512286 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.512319 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 17:18:19 crc kubenswrapper[4795]: E0320 17:18:19.518135 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 17:18:19 crc kubenswrapper[4795]: E0320 17:18:19.518345 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 17:18:20 crc kubenswrapper[4795]: I0320 17:18:20.186441 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:21 crc kubenswrapper[4795]: W0320 17:18:21.033929 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 20 17:18:21 crc kubenswrapper[4795]: E0320 17:18:21.033992 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 20 17:18:21 crc kubenswrapper[4795]: I0320 17:18:21.184620 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.171297 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.171507 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.173190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.173258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.173279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.174106 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db"
Mar 20 17:18:22 crc kubenswrapper[4795]: E0320 17:18:22.174392 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.187501 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:22 crc kubenswrapper[4795]: W0320 17:18:22.530681 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 20 17:18:22 crc kubenswrapper[4795]: E0320 17:18:22.530812 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.971848 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.972104 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.973905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.973975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.973997 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.974977 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db"
Mar 20 17:18:22 crc kubenswrapper[4795]: E0320 17:18:22.975281 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 17:18:23 crc kubenswrapper[4795]: W0320 17:18:23.146200 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 20 17:18:23 crc kubenswrapper[4795]: E0320 17:18:23.146295 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 20 17:18:23 crc kubenswrapper[4795]: I0320 17:18:23.188358 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:23 crc kubenswrapper[4795]: I0320 17:18:23.440908 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 17:18:23 crc kubenswrapper[4795]: I0320 17:18:23.441014 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 17:18:23 crc kubenswrapper[4795]: E0320 17:18:23.447589 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c3bb26b5046\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 20 17:18:23 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb26b5046 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 20 17:18:23 crc kubenswrapper[4795]: body: 
Mar 20 17:18:23 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.440934982 +0000 UTC m=+26.898966563,LastTimestamp:2026-03-20 17:18:23.440986491 +0000 UTC m=+46.899018062,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 17:18:23 crc kubenswrapper[4795]: >
Mar 20 17:18:24 crc kubenswrapper[4795]: I0320 17:18:24.187654 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:25 crc kubenswrapper[4795]: I0320 17:18:25.185936 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.187075 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.518314 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.519882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.519945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.519958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.519984 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 17:18:26 crc kubenswrapper[4795]: E0320 17:18:26.526220 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 17:18:26 crc kubenswrapper[4795]: E0320 17:18:26.526566 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 17:18:27 crc kubenswrapper[4795]: I0320 17:18:27.187995 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:27 crc kubenswrapper[4795]: E0320 17:18:27.355040 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 17:18:28 crc kubenswrapper[4795]: I0320 17:18:28.187971 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:29 crc kubenswrapper[4795]: I0320 17:18:29.186746 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:30 crc kubenswrapper[4795]: I0320 17:18:30.188436 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:31 crc kubenswrapper[4795]: I0320 17:18:31.187372 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.187023 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.400060 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.400213 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.401316 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.401355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.401365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.402978 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.535417 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.536472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.536538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.536557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.186376 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.527044 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.528079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.528110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.528121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.528147 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 17:18:33 crc kubenswrapper[4795]: E0320 17:18:33.533017 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 17:18:33 crc kubenswrapper[4795]: E0320 17:18:33.533243 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.920358 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.920551 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.921676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.921754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.921769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:34 crc kubenswrapper[4795]: I0320 17:18:34.184282 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:35 crc kubenswrapper[4795]: I0320 17:18:35.185637 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:36 crc kubenswrapper[4795]: I0320 17:18:36.186801 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.185823 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.251314 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.252498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.252572 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.252589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.253816 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db"
Mar 20 17:18:37 crc kubenswrapper[4795]: E0320 17:18:37.254061 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 17:18:37 crc kubenswrapper[4795]: E0320 17:18:37.355844 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 17:18:38 crc kubenswrapper[4795]: I0320 17:18:38.184596 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:39 crc kubenswrapper[4795]: I0320 17:18:39.185779 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.188746 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.534013 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.535195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.535235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.535247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.535270 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 17:18:40 crc kubenswrapper[4795]: E0320 17:18:40.541318 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 17:18:40 crc kubenswrapper[4795]: E0320 17:18:40.541592 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 17:18:41 crc kubenswrapper[4795]: I0320 17:18:41.193427 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 17:18:41 crc kubenswrapper[4795]: I0320 17:18:41.818287 4795 csr.go:261] certificate signing request csr-l2n5d is approved, waiting to be issued
Mar 20 17:18:41 crc kubenswrapper[4795]: I0320 17:18:41.830147 4795 csr.go:257] certificate signing request csr-l2n5d is issued
Mar 20 17:18:41 crc kubenswrapper[4795]: I0320 17:18:41.910516 4795 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.028745 4795 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.252084 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.253602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.253642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.253653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.832155 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-24 17:50:32.950082161 +0000 UTC
Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.832197 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5976h31m50.117887473s for next certificate rotation
Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.356324 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.541830 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.543647 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.543730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.543750 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.543885 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.554849 4795 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.555186 4795 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.555223 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.559605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.559664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.559719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.559751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.559776 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:47Z","lastTransitionTime":"2026-03-20T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.578806 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.588034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.588069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.588081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.588119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.588132 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:47Z","lastTransitionTime":"2026-03-20T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.601559 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.611342 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.611398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.611418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.611440 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.611457 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:47Z","lastTransitionTime":"2026-03-20T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.627070 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.638266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.638321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.638334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.638352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.638365 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:47Z","lastTransitionTime":"2026-03-20T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.652858 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.653077 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.653108 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.753950 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.854938 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.955988 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.056093 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.156248 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.256351 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.357408 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.458108 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.558501 4795 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.659637 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.760709 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.861837 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.962766 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.063733 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.164439 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.251701 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.252962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.253027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.253045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.254034 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db" Mar 20 17:18:49 
crc kubenswrapper[4795]: E0320 17:18:49.265249 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.366269 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.467362 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.567854 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.579827 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.582086 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1"} Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.582253 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.583351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.583392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.583404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.668319 4795 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.768805 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.869854 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.971054 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.071623 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.172169 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.272737 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.373536 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.474668 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.575393 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.587782 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.588784 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.591274 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" exitCode=255 Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.591327 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1"} Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.591372 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.591609 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.593294 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.593337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.593354 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.594181 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.594456 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.676270 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.776654 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.877229 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.977815 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.078803 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.179133 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.279199 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.380367 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.481087 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.581759 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: I0320 17:18:51.597377 4795 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.681919 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.782257 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.882660 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.983744 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.084322 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.171629 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.171939 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.173881 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.173958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.173976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.175050 4795 scope.go:117] "RemoveContainer" 
containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.175314 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.185012 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.286191 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.386878 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.487152 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.588153 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.688779 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.788922 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.889952 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.972661 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.972906 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.974419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.974472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.974489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.975569 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.975992 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.990768 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:53 crc kubenswrapper[4795]: E0320 17:18:53.091354 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:53 crc kubenswrapper[4795]: E0320 17:18:53.191990 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:53 crc kubenswrapper[4795]: E0320 17:18:53.292924 4795 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.328360 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.396093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.396151 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.396170 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.396193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.396210 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.499175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.499311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.499329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.499355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.499373 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.602075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.602123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.602135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.602155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.602168 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.704529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.704596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.704614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.704641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.704660 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.808115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.808216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.808228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.808254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.808268 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.910785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.910858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.910888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.910918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.910940 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.013618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.013668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.013709 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.013732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.013749 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.116547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.116607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.116627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.116656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.116740 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.209121 4795 apiserver.go:52] "Watching apiserver" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.214782 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.215125 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.215667 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.215734 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.215791 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.216176 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.216177 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.216254 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.216726 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.216923 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.216815 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.219418 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.219390 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.219737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.219946 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220091 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.219952 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220117 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220769 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220775 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220811 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.223214 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.223673 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.255073 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.269670 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.287511 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.288744 4795 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289882 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289914 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289976 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290068 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290097 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290126 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290156 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: 
I0320 17:18:54.290187 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290216 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290254 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290337 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290370 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290431 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290490 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290519 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290549 
4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290579 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291162 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291179 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291656 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.292026 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.292070 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.292127 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.292234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290610 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.292976 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293044 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293106 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293504 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293569 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293588 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294020 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294098 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294392 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294492 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.295025 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.295060 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.295592 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.295106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296019 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296077 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296124 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296185 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296236 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296282 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296326 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296374 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296418 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296462 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296609 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296618 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296894 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296941 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297191 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297242 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297287 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") 
" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297332 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297377 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297475 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297614 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297748 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297822 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " 
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297920 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297967 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298062 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298111 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298158 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298255 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298303 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298311 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298355 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298403 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298498 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298547 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298595 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298641 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.299491 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300224 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300471 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300578 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300714 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300824 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300875 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300930 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300980 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301006 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301027 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301073 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301117 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301124 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: 
"e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301165 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301214 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301262 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301272 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301359 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301414 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301494 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301520 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301669 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301754 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301856 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301909 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302081 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302168 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302171 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302200 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302252 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302305 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302359 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302455 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302500 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302550 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302604 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302651 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302786 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302834 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302886 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302936 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303136 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303181 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303224 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303309 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303381 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303428 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.303477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303622 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303755 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303854 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304010 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.304058 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304158 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304261 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304309 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304357 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304460 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308453 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308593 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.308674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308785 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308842 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308904 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308971 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309042 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309102 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309170 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309229 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309394 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309466 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309533 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309588 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309650 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310032 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310089 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310153 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310342 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310409 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310497 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310564 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310628 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310715 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310780 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310907 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310961 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311026 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311088 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311142 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311204 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311267 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311322 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311578 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311867 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312001 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312063 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312255 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: 
"3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302576 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302550 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302874 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302824 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304394 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304447 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.305659 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.307058 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.307371 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.307387 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308329 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308488 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308897 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309456 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309488 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309593 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309834 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310224 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310491 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310653 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310755 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311003 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311606 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311604 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312141 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312187 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312794 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312849 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312377 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.313605 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.313698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.313882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.313913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.314871 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.314979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.315354 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.315990 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.316593 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.317167 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.317620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.317944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.318208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.318858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.315926 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.319640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.319662 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.318311 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312788 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.322207 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.322226 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.322868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.322973 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.323551 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.323681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.323752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.323816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.323929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324243 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324453 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324760 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324775 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324789 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.325157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.325483 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.325545 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.325856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.325999 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.326254 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.326714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327118 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327244 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327403 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327622 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327761 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327807 4795 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.319161 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.328158 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.328386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.328605 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.328859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.319591 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.328958 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.329059 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.319560 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.329495 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:54.829450216 +0000 UTC m=+78.287481797 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.329510 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.329951 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.329971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330371 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330426 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330454 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.330489 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330481 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.330674 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:54.830650924 +0000 UTC m=+78.288682475 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.317307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330985 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.331196 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.331448 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:18:54.831411158 +0000 UTC m=+78.289442709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.331476 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.331494 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.331509 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.331580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.333007 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.333988 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.334473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.335230 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.335629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.337223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.337513 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.338177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.339653 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.339888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.340238 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.340361 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.344610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345234 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.342800 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.341276 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.341345 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.341482 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.341710 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.341788 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.342173 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.342582 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.342812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.343320 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.345483 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.345497 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.344882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345267 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.342800 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.344787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.345555 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:54.845535785 +0000 UTC m=+78.303567336 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345579 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345600 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345616 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345634 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345650 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345667 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345702 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345720 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345732 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345744 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345754 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345766 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345778 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.345790 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345801 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345815 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345826 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345841 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345853 4795 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345867 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 
17:18:54.345880 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345891 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345903 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345914 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345925 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345937 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345949 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345961 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345972 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345983 4795 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345994 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346005 4795 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346018 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346031 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346042 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 
17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346053 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346064 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346075 4795 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346086 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346196 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346726 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.347852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.348071 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.349944 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.352499 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.354638 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.355172 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.355509 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.355898 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.356055 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.356069 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.356101 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.356132 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.356154 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.356353 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.356500 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:54.856446313 +0000 UTC m=+78.314477954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.359002 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.360219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.361039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.361349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.361431 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.361890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.362217 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.363137 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.363156 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.364173 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.365098 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.365277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.366106 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.372183 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.376083 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.377274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.378407 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.383811 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.390852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446679 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446839 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446849 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446868 4795 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446900 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446924 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446947 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446973 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447058 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447079 4795 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447100 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447121 4795 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447141 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447160 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447177 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447194 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447211 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447228 4795 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447244 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447261 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447281 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447298 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447315 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447333 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447351 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447369 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447387 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447404 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447421 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447438 4795 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447454 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447471 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 
17:18:54.447488 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447505 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447522 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447539 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447560 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447583 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447618 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447636 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447653 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447671 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447728 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447754 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447775 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447792 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447808 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447827 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447853 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447877 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447902 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447923 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447940 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447957 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447977 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447994 4795 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448010 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448031 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448057 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448080 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448116 4795 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" 
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448140 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448163 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448185 4795 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448264 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448300 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448371 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448413 4795 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448428 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448440 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448452 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448464 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448475 4795 reconciler_common.go:293] "Volume detached for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448486 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448497 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448507 4795 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448519 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448532 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448545 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448556 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448568 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448579 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448590 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448601 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448612 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448623 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448636 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448647 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448659 4795 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448669 4795 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448699 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448711 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448723 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448735 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.448746 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448759 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448770 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448781 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448791 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448803 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448814 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448824 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448836 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448846 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448857 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448868 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448879 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448890 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448901 4795 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448912 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448923 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448935 4795 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448948 4795 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448959 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448969 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448981 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448991 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449002 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449014 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449024 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449035 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449046 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449057 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449067 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449078 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449089 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449101 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449112 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449123 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449138 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.449149 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449160 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449172 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449183 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449194 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449205 4795 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449217 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 
17:18:54.449231 4795 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449246 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449260 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449275 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449288 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449304 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449322 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449338 4795 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449354 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449369 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449384 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449399 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449413 4795 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449428 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449440 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: 
I0320 17:18:54.449451 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.538533 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.551253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.552352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.552509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.552658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.553051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.553197 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.553376 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: source /etc/kubernetes/apiserver-url.env Mar 20 17:18:54 crc kubenswrapper[4795]: else Mar 20 17:18:54 crc kubenswrapper[4795]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:18:54 crc kubenswrapper[4795]: exit 1 Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:18:54 crc kubenswrapper[4795]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.554630 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.565639 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: W0320 17:18:54.570403 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-48351f663652fe24072c75132f500abceb2237709b403dea504d4e856a175414 WatchSource:0}: Error finding container 48351f663652fe24072c75132f500abceb2237709b403dea504d4e856a175414: Status 404 returned error can't find the container with id 48351f663652fe24072c75132f500abceb2237709b403dea504d4e856a175414 Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.573556 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:18:54 crc kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: source "/env/_master" Mar 20 17:18:54 crc kubenswrapper[4795]: set +o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 17:18:54 crc kubenswrapper[4795]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 17:18:54 crc kubenswrapper[4795]: ho_enable="--enable-hybrid-overlay" Mar 20 17:18:54 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 17:18:54 crc kubenswrapper[4795]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 17:18:54 crc kubenswrapper[4795]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-host=127.0.0.1 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-port=9743 \ Mar 20 17:18:54 crc kubenswrapper[4795]: ${ho_enable} \ Mar 20 17:18:54 crc kubenswrapper[4795]: --enable-interconnect \ Mar 20 17:18:54 crc kubenswrapper[4795]: --disable-approver \ Mar 20 17:18:54 crc kubenswrapper[4795]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --wait-for-kubernetes-api=200s \ Mar 20 17:18:54 crc kubenswrapper[4795]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 20 17:18:54 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.577190 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:18:54 crc 
kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: source "/env/_master" Mar 20 17:18:54 crc kubenswrapper[4795]: set +o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: Mar 20 17:18:54 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --disable-webhook \ Mar 20 17:18:54 crc kubenswrapper[4795]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 20 17:18:54 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.578478 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.584529 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePo
licy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.585828 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.610182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7c6e2fcdb8d74e6bd0b7cd1a4feb61a303cc77b9d56a711cfd374e3ae28af2a4"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.611102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"48351f663652fe24072c75132f500abceb2237709b403dea504d4e856a175414"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.612406 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cd6841410ce51491c24386da36797741c3e636d6a01a69f875e919885c6a19b8"} Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.612828 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set 
-xe Mar 20 17:18:54 crc kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: source "/env/_master" Mar 20 17:18:54 crc kubenswrapper[4795]: set +o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 17:18:54 crc kubenswrapper[4795]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 17:18:54 crc kubenswrapper[4795]: ho_enable="--enable-hybrid-overlay" Mar 20 17:18:54 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 17:18:54 crc kubenswrapper[4795]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 17:18:54 crc kubenswrapper[4795]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-host=127.0.0.1 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-port=9743 \ Mar 20 17:18:54 crc kubenswrapper[4795]: ${ho_enable} \ Mar 20 17:18:54 crc kubenswrapper[4795]: --enable-interconnect \ Mar 20 17:18:54 crc kubenswrapper[4795]: --disable-approver \ Mar 20 17:18:54 crc kubenswrapper[4795]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --wait-for-kubernetes-api=200s \ Mar 20 17:18:54 crc kubenswrapper[4795]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 20 17:18:54 
crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct 
envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.612837 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,Res
izePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.613763 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: source /etc/kubernetes/apiserver-url.env Mar 20 17:18:54 crc kubenswrapper[4795]: else Mar 20 17:18:54 crc kubenswrapper[4795]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:18:54 crc kubenswrapper[4795]: exit 1 Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:18:54 crc kubenswrapper[4795]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.615365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.615424 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet 
been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.616418 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:18:54 crc kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: source "/env/_master" Mar 20 17:18:54 crc kubenswrapper[4795]: set +o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: Mar 20 17:18:54 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --disable-webhook \ Mar 20 17:18:54 crc kubenswrapper[4795]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 20 17:18:54 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.619271 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.621878 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.635817 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.647589 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.656983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.657016 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.657030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.657051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.657064 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.658816 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.670059 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.684558 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.694653 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.705844 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.720762 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.732901 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.744492 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.759173 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.760954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.760998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.761029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.761053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.761064 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.853612 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.853826 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.853898 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.853948 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.853994 4795 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:55.853965141 +0000 UTC m=+79.311996712 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.853958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854072 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:55.854027983 +0000 UTC m=+79.312059564 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854101 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854122 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854147 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:55.854133086 +0000 UTC m=+79.312164857 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854162 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854187 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854257 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:55.854233869 +0000 UTC m=+79.312265460 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.863722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.863762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.863806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.863828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.863843 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.954960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.955305 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.955386 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.955412 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.955604 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:55.955570588 +0000 UTC m=+79.413602179 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.966405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.966498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.966516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.966577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.966595 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.069661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.069807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.069828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.069856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.069952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.173043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.173111 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.173134 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.173168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.173191 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.258544 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.259635 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.262068 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.263306 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.265613 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.266611 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.267940 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.269948 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.271250 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.273937 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.275979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.276039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.276058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.276085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.276105 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.276777 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.279324 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.280494 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.281550 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.283407 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.284499 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.286538 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.287626 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.289060 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.291484 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.292629 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.294717 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.295917 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.298291 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.299358 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.300766 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.302932 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.304097 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.306617 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.308198 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.310262 4795 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.310552 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.314111 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.315215 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.316976 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.320099 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.321390 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.323415 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.325180 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.327598 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.328674 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.331043 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.332507 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.334824 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.335829 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.337785 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.338916 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.341553 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.342827 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.344788 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.345907 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.347958 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.349476 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.350748 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.379581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.379747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.379767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.379793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.379813 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.483278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.483366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.483397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.483434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.483466 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.587185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.587251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.587273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.587307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.587331 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.691034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.691102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.691120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.691146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.691166 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.794780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.794838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.794856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.794880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.794898 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.863267 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.863397 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863518 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:57.863480036 +0000 UTC m=+81.321511607 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863620 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863659 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863743 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863756 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.863629 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:55 crc 
kubenswrapper[4795]: E0320 17:18:55.863822 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:57.863807625 +0000 UTC m=+81.321839206 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863851 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:57.863837826 +0000 UTC m=+81.321869407 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.863891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.864042 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.864120 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:57.864101995 +0000 UTC m=+81.322133596 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.898252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.898312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.898331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.898357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.898376 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.959608 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.964595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.964798 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.964830 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.964849 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.964921 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:57.964899816 +0000 UTC m=+81.422931387 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.001444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.001494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.001511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.001536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.001555 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.104747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.104815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.104834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.104861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.104880 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.207581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.207635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.207656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.207680 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.207734 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.251194 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.251241 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:56 crc kubenswrapper[4795]: E0320 17:18:56.251365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:18:56 crc kubenswrapper[4795]: E0320 17:18:56.251491 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.251636 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:56 crc kubenswrapper[4795]: E0320 17:18:56.251982 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.603986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.604048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.604067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.604093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.604118 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.707074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.707545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.707571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.707602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.707625 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.810457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.810542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.810560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.810582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.810598 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.912868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.912945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.912969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.912998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.913022 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.015460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.015517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.015537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.015558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.015575 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.084085 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.118078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.118142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.118177 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.118208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.118230 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.220848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.220928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.220952 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.220981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.221005 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.267846 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.283218 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.299896 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.315887 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.323955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.324009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.324031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.324059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.324080 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.331832 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.345052 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.427451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.427505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.427517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.427539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.427554 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.530615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.530662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.530673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.530717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.530736 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.633280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.633351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.633380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.633410 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.633433 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.736112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.736168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.736191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.736219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.736239 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.839372 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.839433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.839455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.839486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.839514 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.881385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.881490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881542 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:19:01.881512755 +0000 UTC m=+85.339544326 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.881587 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881621 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.881641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881764 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:01.88167834 +0000 UTC m=+85.339709921 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881904 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881941 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881948 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881965 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.882053 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:01.88202613 +0000 UTC m=+85.340057711 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.882089 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:01.882070611 +0000 UTC m=+85.340102192 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.942818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.942894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.942920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.942948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.942969 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.960935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.960996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.961018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.961045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.961068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.977997 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.982957 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.983188 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.983224 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.983244 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.983249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.983307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.983328 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:01.983305597 +0000 UTC m=+85.441337178 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.983332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.983378 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.983399 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.000114 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.004344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.004404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.004431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.004460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.004484 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.020422 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.024395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.024451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.024476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.024508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.024537 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.039135 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.047227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.047318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.047338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.047367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.047385 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.063879 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.064062 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.066132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.066184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.066202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.066229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.066247 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.169052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.169101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.169117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.169139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.169155 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.251702 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.251761 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.251792 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.251949 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.251992 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.252058 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.271730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.271778 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.271795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.271818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.271835 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.374994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.375039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.375056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.375078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.375135 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.478267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.478334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.478357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.478385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.478408 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.580494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.580575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.580593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.580619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.580637 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.683059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.683119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.683135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.683159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.683177 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.786605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.786676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.786738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.786771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.786792 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.889392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.889446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.889464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.889488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.889509 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.992850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.992912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.992962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.992989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.993008 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.095568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.095630 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.095651 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.095677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.095754 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.198078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.198125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.198149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.198181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.198205 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.301117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.301178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.301195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.301217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.301238 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.404191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.404261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.404283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.404311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.404338 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.507503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.507582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.507608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.507639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.507664 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.610966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.611014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.611035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.611061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.611083 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.713169 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.713209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.713231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.713256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.713276 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.815780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.815855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.815878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.815907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.815928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.918568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.918619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.918634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.918658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.918675 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.022137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.022193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.022210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.022235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.022251 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.125711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.125773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.125791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.125818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.125841 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.228899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.228959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.228975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.228998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.229016 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.251294 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.251392 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.251297 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:00 crc kubenswrapper[4795]: E0320 17:19:00.251448 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:00 crc kubenswrapper[4795]: E0320 17:19:00.251550 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:00 crc kubenswrapper[4795]: E0320 17:19:00.251665 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.332152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.332203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.332222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.332252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.332277 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.435381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.435461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.435487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.435519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.435543 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.538131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.538178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.538195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.538219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.538237 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.640667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.640749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.640771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.640796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.640814 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.743046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.743116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.743133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.743158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.743179 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.845902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.845973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.845999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.846028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.846050 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.949172 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.949240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.949258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.949285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.949306 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.052564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.052648 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.052674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.052745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.052770 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.155984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.156052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.156069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.156092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.156110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.258398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.258458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.258477 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.258501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.258520 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.361312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.361380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.361399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.361426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.361448 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.464064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.464149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.464168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.464192 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.464222 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.566531 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.566602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.566625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.566652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.566674 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.669587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.669654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.669678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.669737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.669754 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.773087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.773135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.773152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.773175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.773192 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.876618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.876675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.876724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.876753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.876773 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.920229 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.920396 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920458 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:19:09.920416868 +0000 UTC m=+93.378448449 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.920616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.920679 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920630 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920805 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920825 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920854 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920896 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920907 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:09.920880173 +0000 UTC m=+93.378911744 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.921008 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:09.920987956 +0000 UTC m=+93.379019537 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.921039 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:09.921023838 +0000 UTC m=+93.379055559 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.979611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.979665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.979681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.979738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.979755 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.021921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.022752 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.022811 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.022926 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.023063 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:19:10.023035126 +0000 UTC m=+93.481066697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.083074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.083152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.083226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.083253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.083273 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.186188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.186248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.186264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.186287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.186304 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.251778 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.251813 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.251852 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.251944 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.252125 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.252251 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.289385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.289478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.289495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.289552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.289571 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.392119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.392170 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.392188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.392211 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.392229 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.495045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.495108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.495127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.495152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.495170 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.598382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.598643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.598811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.598963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.599097 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.702438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.702507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.702526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.702549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.702567 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.805442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.806412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.806597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.806812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.806997 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.910418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.910485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.910503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.910525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.910542 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.012858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.012908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.012925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.012947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.012965 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.115949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.116013 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.116032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.116058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.116075 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.218752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.218823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.218849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.218880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.218902 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.321673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.321776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.321794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.321818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.321835 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.424195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.424539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.424769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.424949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.425106 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.528205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.528361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.528388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.528514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.528596 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.631286 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.631629 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.631890 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.632070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.632248 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.735303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.735824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.736006 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.736164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.736299 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.839678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.839794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.839815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.839843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.839863 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.943301 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.943369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.943388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.943463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.943482 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.047312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.047375 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.047391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.047415 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.047433 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.150407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.150470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.150487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.150558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.150584 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.251620 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.251638 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.251986 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:04 crc kubenswrapper[4795]: E0320 17:19:04.251833 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:04 crc kubenswrapper[4795]: E0320 17:19:04.252224 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:04 crc kubenswrapper[4795]: E0320 17:19:04.252351 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.253565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.253644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.253662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.253724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.253742 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.356923 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.357009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.357029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.357052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.357102 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.459983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.460053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.460078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.460114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.460137 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.563285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.563346 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.563367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.563398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.563420 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.666420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.666472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.666485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.666502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.666513 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.769922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.769987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.770010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.770037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.770073 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.872899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.872968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.872993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.873023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.873045 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.975555 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.975635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.975657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.975746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.975773 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.078409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.078452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.078464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.078483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.078493 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.181041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.181100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.181117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.181145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.181165 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.283141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.283208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.283217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.283231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.283242 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.386171 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.386214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.386231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.386253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.386271 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.488633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.488675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.488697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.488717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.488726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.590908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.590967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.590988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.591011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.591027 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.693492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.693543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.693559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.693582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.693600 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.797291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.797373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.797396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.797429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.797460 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.900570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.900661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.900718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.900756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.900780 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.003793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.003851 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.003868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.003892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.003911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.107499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.107568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.107586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.107611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.107628 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.210036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.210089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.210112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.210136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.210153 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.251113 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.251223 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.251363 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.251739 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.252333 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.252570 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.253774 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:19:06 crc kubenswrapper[4795]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:19:06 crc kubenswrapper[4795]: set -o allexport Mar 20 17:19:06 crc kubenswrapper[4795]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:19:06 crc kubenswrapper[4795]: source /etc/kubernetes/apiserver-url.env Mar 20 17:19:06 crc kubenswrapper[4795]: else Mar 20 17:19:06 crc kubenswrapper[4795]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:19:06 crc kubenswrapper[4795]: exit 1 Mar 20 17:19:06 crc kubenswrapper[4795]: fi Mar 20 17:19:06 crc kubenswrapper[4795]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:19:06 crc kubenswrapper[4795]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:19:06 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.255040 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.269660 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 
17:19:06.270044 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.270933 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.312777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.312830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.312855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.312886 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.312907 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.415842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.415896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.415914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.415936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.415952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.518951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.519026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.519048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.519073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.519090 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.622871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.622948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.622971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.623003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.623024 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.641783 4795 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.644113 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.644364 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.725757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.725821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.725838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.725861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.725878 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.829023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.829084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.829101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.829123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.829140 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.932164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.932201 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.932209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.932224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.932234 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.034464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.034502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.034510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.034525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.034534 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.136636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.136676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.136734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.136752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.136777 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.238887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.238941 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.238957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.238979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.238993 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.264720 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.275675 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.291681 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.306043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.321962 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.338557 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.341575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.341653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.341676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.341739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.341766 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.352076 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.443945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.444003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.444020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.444043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.444063 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.546999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.547061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.547079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.547102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.547119 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.649924 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.650000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.650023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.650051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.650076 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.752645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.752793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.752865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.752928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.752953 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.855970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.856030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.856042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.856058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.856070 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.958675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.958733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.958744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.958761 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.958772 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.060721 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.060763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.060774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.060791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.060804 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.163604 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.163653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.163666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.163699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.163715 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.252028 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.252076 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.252208 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.252212 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.252276 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.252421 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.265841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.265893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.265911 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.265937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.265955 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.279971 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.284993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.285035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.285054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.285081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.285099 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.296098 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.300217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.300270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.300287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.300313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.300330 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.314389 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.319406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.319448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.319460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.319479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.319492 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.329603 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.333242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.333435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.333459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.333489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.333515 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.344546 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.344730 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.346247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.346285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.346296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.346311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.346322 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.448475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.448715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.448724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.448738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.448748 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.550521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.550559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.550570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.550585 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.550595 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.650641 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.650716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.651972 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.652006 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.652018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.652037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.652078 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.666146 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.681185 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.693843 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.704895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.721014 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.730587 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.738105 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.754609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.754648 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.754662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.754706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.754723 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.860206 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.860240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.860250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.860265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.860275 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.962711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.962764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.962777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.962795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.962807 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.065662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.065754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.065768 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.065788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.065828 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.168570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.168636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.168659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.168724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.168748 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.272369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.272422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.272442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.272471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.272492 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.375361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.375424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.375435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.375454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.375467 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.478526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.478575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.478592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.478615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.478632 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.581933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.581994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.582012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.582035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.582052 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.684502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.684548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.684565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.684587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.684604 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.787380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.787436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.787455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.787479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.787497 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.890241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.890306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.890330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.890357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.890378 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.991657 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.991783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.991829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.991870 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992067 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:09 crc 
kubenswrapper[4795]: E0320 17:19:09.992103 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992128 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992159 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992107 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:19:25.992060021 +0000 UTC m=+109.450091622 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992557 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:19:25.992534806 +0000 UTC m=+109.450566387 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992579 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:25.992567757 +0000 UTC m=+109.450599328 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992237 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992624 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:25.992613628 +0000 UTC m=+109.450645199 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.993432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.993492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.993513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.993540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.993564 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.093069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.093274 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.093316 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.093337 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.093422 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:26.09339832 +0000 UTC m=+109.551429901 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.095706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.095747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.095758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.095774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.095787 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.198114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.198154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.198163 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.198175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.198188 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.251369 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.251516 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.251397 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.251588 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.251807 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.251912 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.300566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.300634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.300659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.300718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.300741 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.402914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.402957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.402969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.402991 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.403003 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.506053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.506096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.506107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.506123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.506132 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.573077 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-f47gv"] Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.573550 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.576544 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.577132 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.580216 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.602043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.608255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.608321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.608338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.608361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.608378 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.624092 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.643376 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.659957 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.680673 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.698195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-hosts-file\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.698322 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xsdx\" (UniqueName: \"kubernetes.io/projected/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-kube-api-access-8xsdx\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.699094 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.710485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.710561 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.710590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.710618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.710636 4795 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.713516 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.730390 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.799215 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xsdx\" (UniqueName: \"kubernetes.io/projected/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-kube-api-access-8xsdx\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.799366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-hosts-file\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.799490 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-hosts-file\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.813104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.813135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.813145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.813160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.813169 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.822960 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xsdx\" (UniqueName: \"kubernetes.io/projected/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-kube-api-access-8xsdx\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.892655 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: W0320 17:19:10.907138 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ee11f2_6451_4d59_8c55_ffcb0ea973a1.slice/crio-5bddc656c72b1cdc1b5c19893d8be5682d92142779d32cca871102c8a4e4af6b WatchSource:0}: Error finding container 5bddc656c72b1cdc1b5c19893d8be5682d92142779d32cca871102c8a4e4af6b: Status 404 returned error can't find the container with id 5bddc656c72b1cdc1b5c19893d8be5682d92142779d32cca871102c8a4e4af6b Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.915815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.915862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.915875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.915893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.915905 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.962519 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zb4r9"] Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.965183 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xxwb6"] Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.965523 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.965738 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mvxvt"] Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.966147 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.966543 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xxwb6" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.969807 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.970087 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.970303 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.970873 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.970894 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.971066 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.971179 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.971184 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.971724 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.972016 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.972264 4795 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.972617 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.988499 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.002070 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.017209 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.020405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.020442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.020454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.020492 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.020504 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.071759 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.087260 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102570 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102633 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-system-cni-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102667 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-system-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-os-release\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102758 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-netns\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8702afd1-abd3-42d0-91e6-048802e98829-rootfs\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cni-binary-copy\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-kubelet\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-daemon-config\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102984 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-etc-kubernetes\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103005 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-binary-copy\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103022 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cnibin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103037 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-bin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103066 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-multus-certs\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-multus\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-os-release\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103353 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvwcq\" (UniqueName: \"kubernetes.io/projected/d4f0d908-7a54-4fb3-a52d-51d088632c62-kube-api-access-pvwcq\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-conf-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103574 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8702afd1-abd3-42d0-91e6-048802e98829-proxy-tls\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103669 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xmjs\" (UniqueName: \"kubernetes.io/projected/8702afd1-abd3-42d0-91e6-048802e98829-kube-api-access-7xmjs\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-cnibin\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " 
pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103864 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103899 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-hostroot\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103974 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-k8s-cni-cncf-io\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.104052 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxtbp\" (UniqueName: \"kubernetes.io/projected/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-kube-api-access-zxtbp\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.104038 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.104177 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-socket-dir-parent\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.104209 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8702afd1-abd3-42d0-91e6-048802e98829-mcd-auth-proxy-config\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.116595 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.122340 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.122390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.122400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.122413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.122422 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.128978 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.143292 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.155908 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.169998 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.182865 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.198288 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.204969 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-system-cni-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205058 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-system-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205190 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-system-cni-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-system-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-os-release\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-netns\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-netns\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8702afd1-abd3-42d0-91e6-048802e98829-rootfs\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-os-release\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cni-binary-copy\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205561 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8702afd1-abd3-42d0-91e6-048802e98829-rootfs\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-kubelet\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205609 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-daemon-config\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-etc-kubernetes\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-kubelet\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-binary-copy\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc 
kubenswrapper[4795]: I0320 17:19:11.205800 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-etc-kubernetes\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cnibin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-bin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cnibin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-multus-certs\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-bin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-multus\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206011 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-multus-certs\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-os-release\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206038 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-multus\") pod 
\"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvwcq\" (UniqueName: \"kubernetes.io/projected/d4f0d908-7a54-4fb3-a52d-51d088632c62-kube-api-access-pvwcq\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-os-release\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206108 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-conf-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206142 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8702afd1-abd3-42d0-91e6-048802e98829-proxy-tls\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206175 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xmjs\" (UniqueName: \"kubernetes.io/projected/8702afd1-abd3-42d0-91e6-048802e98829-kube-api-access-7xmjs\") pod \"machine-config-daemon-mvxvt\" (UID: 
\"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-cnibin\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206253 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-hostroot\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206313 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-k8s-cni-cncf-io\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206345 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxtbp\" (UniqueName: \"kubernetes.io/projected/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-kube-api-access-zxtbp\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 
crc kubenswrapper[4795]: I0320 17:19:11.206383 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-cnibin\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206397 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-socket-dir-parent\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206417 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-conf-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206431 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8702afd1-abd3-42d0-91e6-048802e98829-mcd-auth-proxy-config\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206611 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-k8s-cni-cncf-io\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206779 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206802 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206825 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-hostroot\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206846 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-socket-dir-parent\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.207268 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-binary-copy\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.207316 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cni-binary-copy\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.207382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-daemon-config\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.208005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8702afd1-abd3-42d0-91e6-048802e98829-mcd-auth-proxy-config\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.209975 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8702afd1-abd3-42d0-91e6-048802e98829-proxy-tls\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.215760 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.226576 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.226613 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.226624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.226657 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.226669 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.228493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxtbp\" (UniqueName: \"kubernetes.io/projected/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-kube-api-access-zxtbp\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.234977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvwcq\" (UniqueName: \"kubernetes.io/projected/d4f0d908-7a54-4fb3-a52d-51d088632c62-kube-api-access-pvwcq\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.236291 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.238915 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xmjs\" (UniqueName: \"kubernetes.io/projected/8702afd1-abd3-42d0-91e6-048802e98829-kube-api-access-7xmjs\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.253308 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.274309 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.287282 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.287915 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.298961 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.308525 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.309315 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: W0320 17:19:11.322319 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8702afd1_abd3_42d0_91e6_048802e98829.slice/crio-f760fe14f2d24a12769306c8249c77b87438579c36c369aa7815e6cd40c149ec WatchSource:0}: Error finding container f760fe14f2d24a12769306c8249c77b87438579c36c369aa7815e6cd40c149ec: Status 404 returned error can't find the container with id f760fe14f2d24a12769306c8249c77b87438579c36c369aa7815e6cd40c149ec Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.327182 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.328312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.328347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.328359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.328376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.328389 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.347297 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: W0320 17:19:11.348907 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8c31a7c_6ccb_43e0_9c95_33b85204cc39.slice/crio-17ca4888cd83f3a23871bc3643dee2b2786931ca8fef8e312ce265cafa74cf21 
WatchSource:0}: Error finding container 17ca4888cd83f3a23871bc3643dee2b2786931ca8fef8e312ce265cafa74cf21: Status 404 returned error can't find the container with id 17ca4888cd83f3a23871bc3643dee2b2786931ca8fef8e312ce265cafa74cf21 Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.368560 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-krk7q"] Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.370401 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.372316 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.373877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.373972 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.374210 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.374397 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.374403 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.375015 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.392996 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.406915 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.408846 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.408901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.408944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 
17:19:11.409037 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409094 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409124 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409231 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 
crc kubenswrapper[4795]: I0320 17:19:11.409274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409380 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vrl5\" (UniqueName: \"kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409500 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides\") pod \"ovnkube-node-krk7q\" 
(UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409786 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409814 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.422477 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.431564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.431606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.431619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 
17:19:11.431636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.431647 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.433932 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.454273 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.469505 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.484227 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.501967 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc 
kubenswrapper[4795]: I0320 17:19:11.510372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510403 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vrl5\" (UniqueName: \"kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510433 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510446 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510485 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510533 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510654 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511186 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511215 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511269 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns\") 
pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511294 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511835 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511870 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" 
Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511913 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512190 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512218 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512298 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512546 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.514533 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.515584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.526333 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vrl5\" (UniqueName: \"kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.530310 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.537401 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.537425 4795 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.537432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.537445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.537453 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.542440 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.554249 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.640719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.641010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.641027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.641043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.641054 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.661884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f47gv" event={"ID":"22ee11f2-6451-4d59-8c55-ffcb0ea973a1","Type":"ContainerStarted","Data":"ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.661937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f47gv" event={"ID":"22ee11f2-6451-4d59-8c55-ffcb0ea973a1","Type":"ContainerStarted","Data":"5bddc656c72b1cdc1b5c19893d8be5682d92142779d32cca871102c8a4e4af6b"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.663557 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerStarted","Data":"e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.663586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerStarted","Data":"17ca4888cd83f3a23871bc3643dee2b2786931ca8fef8e312ce265cafa74cf21"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.666116 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerStarted","Data":"73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 
17:19:11.666154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerStarted","Data":"ceab78a93e5ffcad27bdfeb7a60afee5acc9056757c7694e82e32c5abe81a00c"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.669654 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.669706 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.669720 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"f760fe14f2d24a12769306c8249c77b87438579c36c369aa7815e6cd40c149ec"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.678612 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.689999 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.695245 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: W0320 17:19:11.711969 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod520bb74b_cfa2_4f21_b561_989b0a3d6adc.slice/crio-1a7cad6fc70f9635016cf59ae47845a4cfbc41683f6ddf222d2b7bd36fabfbfb WatchSource:0}: Error finding container 1a7cad6fc70f9635016cf59ae47845a4cfbc41683f6ddf222d2b7bd36fabfbfb: Status 404 returned error can't find the container with id 1a7cad6fc70f9635016cf59ae47845a4cfbc41683f6ddf222d2b7bd36fabfbfb Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.715310 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.732362 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.744308 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.744506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.744640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.744667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.744887 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.749059 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.766139 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.778923 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.796923 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.813776 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.831759 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.846879 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.847407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.847450 4795 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.847470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.847494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.847511 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.858814 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.871012 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.881751 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.897447 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.908559 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.923077 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.936810 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.947425 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.950106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.950156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.950173 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.950193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.950208 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.963575 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.978839 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.993443 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.009967 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.022826 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.052591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.052618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.052626 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc 
kubenswrapper[4795]: I0320 17:19:12.052640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.052651 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.155245 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.155289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.155303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.155325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.155338 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.251067 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:12 crc kubenswrapper[4795]: E0320 17:19:12.251292 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.251164 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:12 crc kubenswrapper[4795]: E0320 17:19:12.251518 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.251096 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:12 crc kubenswrapper[4795]: E0320 17:19:12.251760 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.258237 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.258287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.258306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.258330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.258348 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.361318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.361377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.361395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.361421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.361438 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.465152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.465420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.465430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.465444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.465456 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.568167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.568529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.568779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.568959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.569127 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.673530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.673560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.673568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.673581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.673591 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.676495 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169" exitCode=0 Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.676576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.676598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"1a7cad6fc70f9635016cf59ae47845a4cfbc41683f6ddf222d2b7bd36fabfbfb"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.680185 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.684133 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca" exitCode=0 Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.684204 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.691208 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.715156 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.734261 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.747790 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.766448 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.777159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.777203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.777213 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.777227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.777237 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.779244 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.792625 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.809663 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.831343 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.851585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.866399 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.880963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.881317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.881344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.881376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.881399 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.882498 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.893863 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.907128 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.919577 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.934122 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.945640 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.962585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.975542 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.983832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.983860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.983869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.983882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.983891 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.986991 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.998068 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.011099 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.023804 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.036024 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.086885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.087253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.087271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.087293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.087312 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.189850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.189889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.189901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.189919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.189932 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.291787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.291826 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.291834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.291848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.291857 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.394352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.394390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.394402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.394418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.394428 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.496800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.496842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.496854 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.496883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.496897 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.599329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.599403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.599414 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.599426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.599435 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.689004 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158" exitCode=0 Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.689069 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698315 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698331 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698347 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.701644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.701681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.701734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.701756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.701774 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.712667 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.741807 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.760069 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.787663 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.805139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.805182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.805198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.805217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.805233 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.807462 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.828002 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.845767 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.868724 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.887303 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.900066 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.907450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.907483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.907493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.907508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.907516 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.913648 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.928462 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.009847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.009888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.009900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.009916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.009927 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.112407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.112452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.112463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.112475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.112483 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.215980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.216025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.216040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.216061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.216074 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.251658 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.251787 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.251665 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:14 crc kubenswrapper[4795]: E0320 17:19:14.251843 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:14 crc kubenswrapper[4795]: E0320 17:19:14.251954 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:14 crc kubenswrapper[4795]: E0320 17:19:14.252108 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.269368 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.319060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.319096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.319108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.319125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.319139 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.421536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.421577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.421588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.421603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.421615 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.524476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.524524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.524540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.524557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.524569 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.629141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.629229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.629259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.629292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.629329 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.704575 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93" exitCode=0 Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.704653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.726592 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.731544 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.731609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.731628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 
17:19:14.731653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.731670 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.744635 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.762100 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 
17:19:14.775629 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.799242 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.825643 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.834259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.834305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.834318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.834336 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.834348 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.842315 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.857739 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.870616 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.892970 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.910440 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.924966 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938399 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938665 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.041518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.041812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.041966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.042116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.042247 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.144261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.145110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.145272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.145412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.145544 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.248509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.248749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.248885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.249116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.249245 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.358737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.359451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.359649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.359913 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.360106 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.463959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.464267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.464462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.464919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.465107 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.568109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.568164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.568180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.568207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.568225 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.671040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.671102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.671126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.671154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.671177 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.716386 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84" exitCode=0 Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.716534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.725047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.744895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.766478 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.773148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.773182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.773196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.773216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.773232 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.791456 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.809060 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.833100 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.856210 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.876191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.876242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.876254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.876271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.876284 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.881354 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.896563 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.911636 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.931901 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.951974 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.970534 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.978082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.978110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.978118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.978131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.978140 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.980620 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.080093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.080122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.080132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.080145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.080155 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.182079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.182107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.182116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.182128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.182137 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.251874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.251938 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.251974 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:16 crc kubenswrapper[4795]: E0320 17:19:16.252061 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:16 crc kubenswrapper[4795]: E0320 17:19:16.252272 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:16 crc kubenswrapper[4795]: E0320 17:19:16.252430 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.291288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.291319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.291330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.291346 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.291360 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.394072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.394101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.394114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.394129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.394140 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.497095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.497133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.497149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.497176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.497201 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.604895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.604937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.604948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.604965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.604976 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.708562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.708624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.708643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.708673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.708717 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.733278 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d" exitCode=0 Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.733324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.769260 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.786891 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.803057 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.811726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 
17:19:16.811799 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.811823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.811856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.811879 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.820192 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.863268 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.893910 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.908807 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.914628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.914665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.914677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.914713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.914726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.917466 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.931291 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.944325 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.960086 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.974607 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.992580 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.017249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.017309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.017326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.017351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.017372 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.120440 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.120505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.120522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.120546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.120563 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.224021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.224074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.224090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.224115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.224131 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.284040 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.306336 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.326853 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.328652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.329055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.329257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.329399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.329518 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.345988 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-pgsfb"] Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.346628 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.350036 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.350248 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.353423 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.353455 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.358750 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.373406 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13b60165-8101-45a9-91da-d6d1ba46a6cf-host\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.373486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5vwz\" (UniqueName: \"kubernetes.io/projected/13b60165-8101-45a9-91da-d6d1ba46a6cf-kube-api-access-t5vwz\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.373539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/13b60165-8101-45a9-91da-d6d1ba46a6cf-serviceca\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc 
kubenswrapper[4795]: I0320 17:19:17.375555 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.406615 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.427727 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.433598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.433853 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.434002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.434169 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 
17:19:17.434296 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.447340 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.467953 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.475067 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13b60165-8101-45a9-91da-d6d1ba46a6cf-host\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.475144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5vwz\" (UniqueName: \"kubernetes.io/projected/13b60165-8101-45a9-91da-d6d1ba46a6cf-kube-api-access-t5vwz\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.475213 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/13b60165-8101-45a9-91da-d6d1ba46a6cf-serviceca\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.475223 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13b60165-8101-45a9-91da-d6d1ba46a6cf-host\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.477328 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/13b60165-8101-45a9-91da-d6d1ba46a6cf-serviceca\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.486487 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.506625 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5vwz\" (UniqueName: \"kubernetes.io/projected/13b60165-8101-45a9-91da-d6d1ba46a6cf-kube-api-access-t5vwz\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.507201 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93
a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",
\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.528618 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.537082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.537164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.537186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.537214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.537236 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.545642 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.570633 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e
6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.585649 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.603302 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.627299 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.639405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.639469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.639494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.639525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.639549 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.641922 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.656073 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.669115 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.670624 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d
81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.683108 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: W0320 17:19:17.685852 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b60165_8101_45a9_91da_d6d1ba46a6cf.slice/crio-f11afad5b1d5a04a287ab7a29596a5019718060a614e9bafab9682fc1ef06b37 WatchSource:0}: Error finding container f11afad5b1d5a04a287ab7a29596a5019718060a614e9bafab9682fc1ef06b37: Status 404 returned error can't find the container with id 
f11afad5b1d5a04a287ab7a29596a5019718060a614e9bafab9682fc1ef06b37 Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.707011 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.729528 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.741149 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744089 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea" exitCode=0 Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744994 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.748077 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pgsfb" event={"ID":"13b60165-8101-45a9-91da-d6d1ba46a6cf","Type":"ContainerStarted","Data":"f11afad5b1d5a04a287ab7a29596a5019718060a614e9bafab9682fc1ef06b37"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.759363 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.772382 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.786078 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.798920 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.813215 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.828089 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.848277 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.848315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.848325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.848343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.848353 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.855962 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.871297 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.890042 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.910705 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e
6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.923304 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.933164 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.949543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.949575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.949586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.949604 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.949616 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.953043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.965992 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.978827 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.991951 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.005350 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.052252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.052319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.052343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.052372 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.052394 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.156088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.156145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.156164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.156193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.156214 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.251138 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.251153 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.251154 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.251260 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.251391 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.251613 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.259765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.259849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.259868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.259929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.259949 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.363073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.363109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.363118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.363132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.363142 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.466533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.466573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.466583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.466597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.466607 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.569208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.569261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.569280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.569303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.569322 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.588368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.588421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.588437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.588457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.588471 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.607584 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.611356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.611403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.611421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.611446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.611465 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.625828 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.629309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.629365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.629394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.629424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.629447 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.643859 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.647039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.647076 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.647087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.647103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.647114 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.658803 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.661951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.661983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.661994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.662009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.662021 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.681997 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.682229 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.688166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.688243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.688264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.688292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.688322 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.762380 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.764346 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.764418 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.764554 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.773621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerStarted","Data":"20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.776434 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pgsfb" event={"ID":"13b60165-8101-45a9-91da-d6d1ba46a6cf","Type":"ContainerStarted","Data":"5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.785023 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.793337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.793408 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.793430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.793463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.793485 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.804963 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.804940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.808138 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.823163 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.839247 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.858924 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.888933 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.896556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.896618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.896642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.896672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.896727 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.909257 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.941880 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.961007 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e
6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.972075 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.981949 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.998299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.998333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.998345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.998360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.998372 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.009164 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.025149 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.036763 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.052382 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.064540 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.092627 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.101131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.101191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.101210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.101235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.101255 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.111999 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.128035 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.147413 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.160926 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: 
I0320 17:19:19.173463 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.204418 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.205084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.205119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.205132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.205149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.205162 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.224432 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.241862 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.263212 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.283178 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.302586 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.312994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.313195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.313360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.313494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.313621 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.416470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.416532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.416547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.416567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.416580 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.518852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.518902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.518917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.518938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.518954 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.624349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.624412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.624454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.624483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.624501 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.727733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.728270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.728383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.728523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.728734 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.782148 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.804358 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.828347 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.833082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.833158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.833181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.833217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.833241 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.850482 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.882148 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.906513 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.930404 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.935297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.935353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.935371 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.935393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.935407 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.953940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z 
is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.981123 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.004265 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.018426 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718
b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.029895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.037983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.038017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.038027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc 
kubenswrapper[4795]: I0320 17:19:20.038042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.038051 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.047517 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.059661 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.071921 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.140620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.140659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.140669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.140707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.140723 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.243986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.244055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.244078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.244108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.244128 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.251253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:20 crc kubenswrapper[4795]: E0320 17:19:20.251398 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.251480 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:20 crc kubenswrapper[4795]: E0320 17:19:20.251562 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.251640 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:20 crc kubenswrapper[4795]: E0320 17:19:20.251743 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.347244 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.347309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.347334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.347363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.347383 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.450598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.450993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.451020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.451038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.451055 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.554178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.554231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.554242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.554261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.554273 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.657800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.657847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.657859 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.657879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.657891 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.760094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.760126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.760133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.760147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.760157 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.808286 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/0.log" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.811463 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6" exitCode=1 Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.811519 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.812942 4795 scope.go:117] "RemoveContainer" containerID="4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.839588 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.856736 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.862271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.862306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.862321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.862343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.862358 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.874051 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.890578 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.901316 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: 
I0320 17:19:20.920616 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:20Z\\\",\\\"message\\\":\\\"t/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:19:20.568148 6528 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:20.568453 6528 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:20.568541 6528 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 17:19:20.568575 6528 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 17:19:20.568620 6528 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:20.568659 6528 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 17:19:20.568734 6528 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 17:19:20.568808 6528 factory.go:656] Stopping watch factory\\\\nI0320 17:19:20.568847 6528 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:20.568897 6528 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:19:20.568901 6528 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 17:19:20.569007 6528 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 17:19:20.568911 6528 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:20.568932 6528 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.934237 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.947176 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.966101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.966137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.966153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.966176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.966194 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.970226 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.988959 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.004851 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.034150 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.045946 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.058038 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.069241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.069283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.069295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.069312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.069325 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.171796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.171835 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.171847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.171863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.171875 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.252609 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:19:21 crc kubenswrapper[4795]: E0320 17:19:21.252877 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.274366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.274401 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.274413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.274428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.274440 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.376413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.376461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.376473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.376489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.376501 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.478999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.479054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.479069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.479087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.479099 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.581560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.581599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.581608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.581624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.581640 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.684234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.684282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.684290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.684305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.684317 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.786991 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.787274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.787285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.787302 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.787314 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.818789 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/0.log" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.822108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.823017 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.840989 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.856210 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.869469 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: 
I0320 17:19:21.889564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.889594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.889602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.889615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.889625 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.899056 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:20Z\\\",\\\"message\\\":\\\"t/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:19:20.568148 6528 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:20.568453 6528 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 
17:19:20.568541 6528 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 17:19:20.568575 6528 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 17:19:20.568620 6528 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:20.568659 6528 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 17:19:20.568734 6528 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 17:19:20.568808 6528 factory.go:656] Stopping watch factory\\\\nI0320 17:19:20.568847 6528 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:20.568897 6528 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:19:20.568901 6528 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 17:19:20.569007 6528 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 17:19:20.568911 6528 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:20.568932 6528 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.923372 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357
eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.939183 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.953825 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.973287 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.986860 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.992427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.992487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.992512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.992538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.992556 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.023291 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.039437 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.056940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.068874 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.088085 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.095587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.095640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.095652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 
17:19:22.095670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.095702 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.198277 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.198344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.198366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.198393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.198409 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.251823 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.251869 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.251868 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:22 crc kubenswrapper[4795]: E0320 17:19:22.251992 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:22 crc kubenswrapper[4795]: E0320 17:19:22.252163 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:22 crc kubenswrapper[4795]: E0320 17:19:22.252358 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.302247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.302329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.302347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.302370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.302391 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.405318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.405406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.405464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.405501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.405525 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.508390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.508468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.508498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.508524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.508542 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.611274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.611337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.611358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.611385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.611402 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.714457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.714520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.714541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.714569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.714588 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.817804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.817859 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.817877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.817904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.817920 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.828118 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/1.log" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.829025 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/0.log" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.832862 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302" exitCode=1 Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.832911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.832956 4795 scope.go:117] "RemoveContainer" containerID="4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.833960 4795 scope.go:117] "RemoveContainer" containerID="14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302" Mar 20 17:19:22 crc kubenswrapper[4795]: E0320 17:19:22.834209 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.859638 4795 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.880503 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.899806 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.915842 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.920625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.920716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.920742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.920773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.920796 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.940005 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.961117 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.979898 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.002341 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: 
I0320 17:19:23.024827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.024899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.024925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.024957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.024975 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.029379 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:20Z\\\",\\\"message\\\":\\\"t/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:19:20.568148 6528 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:20.568453 6528 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 
17:19:20.568541 6528 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 17:19:20.568575 6528 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 17:19:20.568620 6528 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:20.568659 6528 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 17:19:20.568734 6528 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 17:19:20.568808 6528 factory.go:656] Stopping watch factory\\\\nI0320 17:19:20.568847 6528 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:20.568897 6528 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:19:20.568901 6528 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 17:19:20.569007 6528 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 17:19:20.568911 6528 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:20.568932 6528 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 
handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.065101 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.088863 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.108415 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.131329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.131458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.131486 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.131522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.131562 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.135282 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.153484 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.234661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.234764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.234781 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.234809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.234829 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.334261 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp"] Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.335034 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.338483 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.338716 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.339325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.339385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.339406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.339435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.339457 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.351996 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.385553 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40f
b2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.407509 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.426860 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.443077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.443136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.443154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.443178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.443202 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.447851 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.452124 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-env-overrides\") pod \"ovnkube-control-plane-749d76644c-727fp\" 
(UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.452239 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbgk\" (UniqueName: \"kubernetes.io/projected/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-kube-api-access-jtbgk\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.452346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.452473 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.466975 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.489428 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.508865 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.529043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.545258 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.546428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.546485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.546507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.546539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.546560 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.553005 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-env-overrides\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.553059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbgk\" (UniqueName: \"kubernetes.io/projected/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-kube-api-access-jtbgk\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.553120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.553199 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.554136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-env-overrides\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.554447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.568897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.579926 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:20Z\\\",\\\"message\\\":\\\"t/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:19:20.568148 6528 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:20.568453 6528 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 
17:19:20.568541 6528 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 17:19:20.568575 6528 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 17:19:20.568620 6528 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:20.568659 6528 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 17:19:20.568734 6528 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 17:19:20.568808 6528 factory.go:656] Stopping watch factory\\\\nI0320 17:19:20.568847 6528 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:20.568897 6528 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:19:20.568901 6528 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 17:19:20.569007 6528 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 17:19:20.568911 6528 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:20.568932 6528 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 
handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.585757 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbgk\" (UniqueName: 
\"kubernetes.io/projected/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-kube-api-access-jtbgk\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.604579 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.629188 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.648672 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.649121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.649186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.649212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.649243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.649270 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.659100 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.663300 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-
rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: W0320 17:19:23.673994 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb75ab9_7e4b_411f_bebe_cf4e2016b031.slice/crio-f9796b73deaca7d582a6518de0261f733679cd4c13913a195fda22a726993579 WatchSource:0}: Error finding container f9796b73deaca7d582a6518de0261f733679cd4c13913a195fda22a726993579: Status 404 returned error can't find the container with id f9796b73deaca7d582a6518de0261f733679cd4c13913a195fda22a726993579 Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.756049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.756102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.756119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.756144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.756166 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.837049 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" event={"ID":"3bb75ab9-7e4b-411f-bebe-cf4e2016b031","Type":"ContainerStarted","Data":"f9796b73deaca7d582a6518de0261f733679cd4c13913a195fda22a726993579"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.839318 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/1.log" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.843484 4795 scope.go:117] "RemoveContainer" containerID="14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302" Mar 20 17:19:23 crc kubenswrapper[4795]: E0320 17:19:23.843774 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.859717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.859766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.859782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.859801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.859816 4795 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.873330 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.892726 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357
eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.910497 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.928884 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.946711 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: 
I0320 17:19:23.961354 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.966213 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.966271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.966287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.966304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.966321 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.989099 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.004014 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.016989 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.032593 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.046229 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.067436 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.069314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.069365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.069383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.069406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.069422 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.081382 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.083138 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jpp4c"] Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.083639 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.083735 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.096630 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.112427 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.126252 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.135755 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.146726 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc 
kubenswrapper[4795]: I0320 17:19:24.163530 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.163587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmxqx\" (UniqueName: \"kubernetes.io/projected/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-kube-api-access-tmxqx\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.166563 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.171281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.171326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.171338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.171357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.171368 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.178009 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.191224 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.202796 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: 
I0320 17:19:24.220219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.241825 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.251890 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.251951 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.252283 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.251974 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.252525 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.252306 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.258425 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.264295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.264411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmxqx\" (UniqueName: \"kubernetes.io/projected/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-kube-api-access-tmxqx\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.264734 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.264814 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:24.764793771 +0000 UTC m=+108.222825422 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.273299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.273337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.273349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.273365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.273375 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.279740 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.295125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.295307 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmxqx\" (UniqueName: \"kubernetes.io/projected/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-kube-api-access-tmxqx\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.308219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.320654 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
7:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.335750 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.349806 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.375856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.376117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.376271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.376341 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.376410 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.479279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.479341 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.479359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.479386 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.479404 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.582385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.582606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.582740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.582839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.582919 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.686027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.686097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.686122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.686154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.686178 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.770291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.770547 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.770633 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:25.770612736 +0000 UTC m=+109.228644317 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.793188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.793235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.793253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.793277 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.793294 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.847935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" event={"ID":"3bb75ab9-7e4b-411f-bebe-cf4e2016b031","Type":"ContainerStarted","Data":"1fd620d94f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.848648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" event={"ID":"3bb75ab9-7e4b-411f-bebe-cf4e2016b031","Type":"ContainerStarted","Data":"1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.861650 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.874130 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.883625 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.895349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.895405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.895427 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.895457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.895483 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.896632 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.908440 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.921073 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.953268 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.970762 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.993230 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.998189 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.998250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.998267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.998291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.998308 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.009623 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.024029 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.040559 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.054264 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.078086 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.095035 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.100453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.100605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.100723 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.100833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.100934 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.111621 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.203901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.204278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.204297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.204322 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.204340 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.306907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.306962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.306979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.307002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.307019 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.410363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.410419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.410436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.410463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.410482 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.513060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.513107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.513124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.513146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.513164 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.615090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.615168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.615193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.615218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.615235 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.717843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.717922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.717946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.717976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.717993 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.781295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:25 crc kubenswrapper[4795]: E0320 17:19:25.781568 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:25 crc kubenswrapper[4795]: E0320 17:19:25.781653 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:27.781623707 +0000 UTC m=+111.239655288 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.821242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.821306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.821331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.821362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.821388 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.924972 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.925039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.925058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.925083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.925101 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.030474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.030542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.030563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.030591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.030613 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.083985 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084194 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:19:58.084153717 +0000 UTC m=+141.542185298 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.084277 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.084362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.084429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084435 4795 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084641 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084675 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084734 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084529 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084733 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:58.084707024 +0000 UTC m=+141.542738605 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084883 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:58.084847317 +0000 UTC m=+141.542878898 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084912 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:58.084897859 +0000 UTC m=+141.542929440 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.133486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.133549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.133566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.133590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.133608 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.185513 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.185804 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.185833 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.185852 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.185933 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:58.185909907 +0000 UTC m=+141.643941489 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.236951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.237010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.237027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.237052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.237070 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.251713 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.251791 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.251893 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.252019 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.252060 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.252113 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.252233 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.252311 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.340229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.340285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.340298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.340316 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.340328 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.443507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.443571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.443594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.443624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.443648 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.546315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.546380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.546398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.546423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.546443 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.649369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.649428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.649445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.649468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.649486 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.753501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.753565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.753583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.753608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.753627 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.855795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.857057 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.857075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.857097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.857114 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.960275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.960341 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.960358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.960382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.960399 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.063903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.063967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.063984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.064008 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.064027 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.167441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.167509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.167526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.167549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.167567 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.270502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.270567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.270591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.270621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.270644 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.273174 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.290383 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.307162 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc 
kubenswrapper[4795]: I0320 17:19:27.329889 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.349918 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373105 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373155 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373742 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3
bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.392157 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.425139 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.456936 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.476491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.476557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.476577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.476606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.476626 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.477242 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.498034 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.520130 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.535754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.553939 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
7:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.575107 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.579420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc 
kubenswrapper[4795]: I0320 17:19:27.579473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.579490 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.579515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.579531 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.593154 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.682713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.682783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.682800 4795 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.682824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.682842 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.786740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.786800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.786816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.786841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.786859 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.804438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:27 crc kubenswrapper[4795]: E0320 17:19:27.804627 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:27 crc kubenswrapper[4795]: E0320 17:19:27.804760 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:31.804730903 +0000 UTC m=+115.262762474 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.889788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.889866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.889885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.889910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.889928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.992987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.993037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.993054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.993074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.993086 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.096578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.096717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.096747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.096775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.096798 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.200189 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.200268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.200287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.200313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.200331 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.251972 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.252037 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.252068 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.252065 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.252163 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.252284 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.252368 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.252461 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.303522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.303617 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.303635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.303715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.303735 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.408411 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.408486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.408508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.408532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.408552 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.511528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.511620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.511640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.511664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.511707 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.615755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.619521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.619678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.619759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.619811 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.703717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.703775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.703787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.703805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.703820 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.723539 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.728650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.728713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.728724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.728742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.728756 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.743720 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.748460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.748535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.748557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.748585 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.748604 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.767800 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.772495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.772545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.772596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.772621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.772728 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.796145 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.801121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.801210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.801231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.801256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.801302 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.823497 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.823673 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.825306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.825370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.825389 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.825414 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.825469 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.929281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.929392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.929420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.929450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.929472 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.032245 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.032320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.032332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.032352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.032363 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.135585 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.135635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.135654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.135680 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.135724 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.238974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.239025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.239044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.239071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.239088 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.341905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.341969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.341988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.342012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.342031 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.445753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.445816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.445834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.445858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.445875 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.549320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.549417 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.549437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.549461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.549509 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.652634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.652704 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.652728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.652753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.652764 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.755717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.755780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.755799 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.755825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.755843 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.859032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.859354 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.859487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.859627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.859797 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.962438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.962506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.962523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.962550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.962568 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.065656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.065740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.065759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.065789 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.065811 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.168926 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.168985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.169003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.169026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.169044 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.251124 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.251186 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.251201 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.251122 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:30 crc kubenswrapper[4795]: E0320 17:19:30.251341 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:30 crc kubenswrapper[4795]: E0320 17:19:30.251426 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:30 crc kubenswrapper[4795]: E0320 17:19:30.251597 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:30 crc kubenswrapper[4795]: E0320 17:19:30.251748 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.271740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.271790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.271810 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.271836 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.271857 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.374893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.374953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.374970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.374995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.375012 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.478145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.478204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.478222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.478251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.478270 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.581366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.581449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.581472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.581500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.581526 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.684822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.684892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.684909 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.684934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.684951 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.788007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.788073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.788095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.788125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.788147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.891390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.891453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.891469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.891494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.891512 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.995162 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.995229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.995246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.995271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.995291 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.097679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.097757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.097773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.097800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.097817 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.200390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.200449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.200474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.200503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.200525 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.303841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.303905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.303938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.304009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.304037 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.407071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.407150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.407169 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.407195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.407213 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.510140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.510201 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.510222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.510246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.510262 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.613285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.613335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.613355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.613381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.613424 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.716249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.716311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.716330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.716353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.716371 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.819636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.819755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.819774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.819798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.819815 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.855639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:31 crc kubenswrapper[4795]: E0320 17:19:31.856292 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:31 crc kubenswrapper[4795]: E0320 17:19:31.856535 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:39.856487598 +0000 UTC m=+123.314519169 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.922722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.923010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.923112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.923212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.923296 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.025873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.026155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.026236 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.026332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.026425 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.128872 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.128937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.128955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.128978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.128997 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.231624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.231999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.232135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.232249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.232342 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.251533 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.251533 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.251558 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.251593 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:32 crc kubenswrapper[4795]: E0320 17:19:32.251996 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:32 crc kubenswrapper[4795]: E0320 17:19:32.252060 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:32 crc kubenswrapper[4795]: E0320 17:19:32.252113 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:32 crc kubenswrapper[4795]: E0320 17:19:32.252175 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.335327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.335369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.335409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.335431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.335447 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.438328 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.438395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.438437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.438473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.438494 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.541532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.541630 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.541655 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.541724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.541784 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.644541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.644589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.644598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.644613 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.644624 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.748064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.748149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.748167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.748192 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.748209 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.851491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.851637 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.851660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.851751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.851769 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.954542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.954626 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.954649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.954712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.954739 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.058000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.058060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.058078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.058101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.058120 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.160925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.161003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.161026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.161051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.161068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.263910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.263966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.263983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.264004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.264022 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.366873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.366946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.366971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.367000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.367021 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.486279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.486355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.486381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.486409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.486430 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.589300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.589364 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.589381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.589406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.589425 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.692927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.692992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.693010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.693035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.693052 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.797528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.797602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.797625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.797660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.797725 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.900584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.900646 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.900663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.900718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.900738 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.003657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.003785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.003812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.003837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.003854 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.107993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.108049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.108066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.108090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.108108 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.211057 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.211433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.211579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.211760 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.211888 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.252055 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.252127 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.252058 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:34 crc kubenswrapper[4795]: E0320 17:19:34.252254 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:34 crc kubenswrapper[4795]: E0320 17:19:34.252339 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:34 crc kubenswrapper[4795]: E0320 17:19:34.252494 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.252081 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:34 crc kubenswrapper[4795]: E0320 17:19:34.253002 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.314844 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.314927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.314950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.314985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.315012 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.417717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.417782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.417800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.417828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.417845 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.520672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.520769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.520787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.520811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.520829 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.623515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.623578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.623596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.623621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.623639 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.727136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.727218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.727241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.727272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.727292 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.830730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.831582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.831755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.831889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.832057 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.935122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.935433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.935581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.935787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.935935 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.039446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.039818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.039978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.040116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.040243 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.142766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.142824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.142842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.142868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.142887 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.245381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.245421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.245432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.245448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.245460 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.251756 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.347826 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.347876 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.347893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.347915 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.347932 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.455955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.456024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.456044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.456071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.456089 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.560649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.560776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.560799 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.560830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.560849 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.663865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.663898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.663907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.663920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.663930 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.767178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.767267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.767295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.767332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.767359 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.870610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.870673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.870734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.870773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.870803 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.896419 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.900869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.901540 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.933824 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.959003 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.974848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 
17:19:35.974912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.974934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.974964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.974982 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.983082 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.003236 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.035599 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.053973 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.073406 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.077560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.077624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.077639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.077659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.077676 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.093068 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z 
is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.108509 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.129866 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.148118 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.172250 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.180898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.180965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.180984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.181012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.181029 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.190754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.223488 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.246217 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.251241 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.251310 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.251440 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.251557 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:36 crc kubenswrapper[4795]: E0320 17:19:36.251551 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:36 crc kubenswrapper[4795]: E0320 17:19:36.251740 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:36 crc kubenswrapper[4795]: E0320 17:19:36.251776 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:36 crc kubenswrapper[4795]: E0320 17:19:36.252003 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.269847 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.284479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.284560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.284578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.284610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.284633 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.388186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.388562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.388579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.388606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.388623 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.491140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.491200 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.491217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.491240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.491259 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.593851 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.593910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.593929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.593954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.593972 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.696758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.696831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.696853 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.696878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.696898 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.799873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.799953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.799971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.799995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.800013 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.902830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.902884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.902902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.902925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.902942 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.006449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.006498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.006514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.006538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.006554 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:37Z","lastTransitionTime":"2026-03-20T17:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.109150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.109187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.109198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.109214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.109226 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:37Z","lastTransitionTime":"2026-03-20T17:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:37 crc kubenswrapper[4795]: E0320 17:19:37.210400 4795 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.252731 4795 scope.go:117] "RemoveContainer" containerID="14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.269360 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\
\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.295945 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b81
9eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.316055 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.336656 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.352641 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: E0320 17:19:37.376489 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.376351 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.396411 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.421698 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.440824 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: 
I0320 17:19:37.472651 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.492984 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.531172 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.556957 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.574980 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.595470 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.613939 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.911828 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/1.log" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.919123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212"} Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.919723 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.942479 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.957745 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.972960 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.986401 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.002174 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.013721 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: 
I0320 17:19:38.042372 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 
17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.057405 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc 
kubenswrapper[4795]: I0320 17:19:38.085813 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.102346 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.129042 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.149167 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.162095 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.175025 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
7:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.195169 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.207260 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.251725 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.251788 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.251817 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.251795 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.251950 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.252039 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.252100 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.252280 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.926151 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/2.log" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.927118 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/1.log" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.933916 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" exitCode=1 Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.933980 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212"} Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.934029 4795 scope.go:117] "RemoveContainer" containerID="14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.934718 4795 scope.go:117] "RemoveContainer" 
containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.934934 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.962370 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.967848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.967954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.967975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.967996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.968010 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:38Z","lastTransitionTime":"2026-03-20T17:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.986131 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.986287 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.991449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.991491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.991505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.991522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.991538 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:38Z","lastTransitionTime":"2026-03-20T17:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.010865 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.015983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.016030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.016049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.016073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.016091 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:39Z","lastTransitionTime":"2026-03-20T17:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.016252 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.033605 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.038774 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.045409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.045513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.045589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:39 crc 
kubenswrapper[4795]: I0320 17:19:39.045711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.045742 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:39Z","lastTransitionTime":"2026-03-20T17:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.065236 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.069320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.069397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.069416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.069435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.069448 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:39Z","lastTransitionTime":"2026-03-20T17:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.074522 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 
services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cn
i-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",
\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc 
kubenswrapper[4795]: E0320 17:19:39.088218 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.089295 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.089382 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.119954 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178ee
c4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.142397 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.161514 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.180774 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.196791 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.216176 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
7:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.237362 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.255253 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.275555 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.291011 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.940976 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/2.log" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.947349 4795 scope.go:117] "RemoveContainer" containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.947715 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.955630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.955865 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.955992 4795 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:55.955957232 +0000 UTC m=+139.413988803 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.969406 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.987087 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.021308 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.040510 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.066083 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.088149 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.110987 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.131335 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: 
I0320 17:19:40.147309 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.180568 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.197246 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.215295 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.228386 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.246342 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.251139 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.251172 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.251175 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:40 crc kubenswrapper[4795]: E0320 17:19:40.251322 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.251356 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:40 crc kubenswrapper[4795]: E0320 17:19:40.251456 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:40 crc kubenswrapper[4795]: E0320 17:19:40.251571 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:40 crc kubenswrapper[4795]: E0320 17:19:40.251659 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.266085 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.283405 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:42 crc kubenswrapper[4795]: I0320 17:19:42.252062 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:42 crc kubenswrapper[4795]: I0320 17:19:42.252114 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:42 crc kubenswrapper[4795]: E0320 17:19:42.252487 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:42 crc kubenswrapper[4795]: I0320 17:19:42.252138 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:42 crc kubenswrapper[4795]: I0320 17:19:42.252119 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:42 crc kubenswrapper[4795]: E0320 17:19:42.252571 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:42 crc kubenswrapper[4795]: E0320 17:19:42.252648 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:42 crc kubenswrapper[4795]: E0320 17:19:42.252767 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:42 crc kubenswrapper[4795]: I0320 17:19:42.262672 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 17:19:42 crc kubenswrapper[4795]: E0320 17:19:42.378016 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:19:44 crc kubenswrapper[4795]: I0320 17:19:44.252017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:44 crc kubenswrapper[4795]: I0320 17:19:44.252047 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:44 crc kubenswrapper[4795]: I0320 17:19:44.252075 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:44 crc kubenswrapper[4795]: E0320 17:19:44.252198 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:44 crc kubenswrapper[4795]: I0320 17:19:44.252233 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:44 crc kubenswrapper[4795]: E0320 17:19:44.252453 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:44 crc kubenswrapper[4795]: E0320 17:19:44.252592 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:44 crc kubenswrapper[4795]: E0320 17:19:44.252748 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:46 crc kubenswrapper[4795]: I0320 17:19:46.252430 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:46 crc kubenswrapper[4795]: I0320 17:19:46.253888 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:46 crc kubenswrapper[4795]: I0320 17:19:46.253908 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:46 crc kubenswrapper[4795]: I0320 17:19:46.254055 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:46 crc kubenswrapper[4795]: E0320 17:19:46.261008 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:46 crc kubenswrapper[4795]: E0320 17:19:46.261840 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:46 crc kubenswrapper[4795]: E0320 17:19:46.262025 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:46 crc kubenswrapper[4795]: E0320 17:19:46.261789 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.271102 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.293900 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.317855 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: 
I0320 17:19:47.348157 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.365076 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: E0320 17:19:47.378756 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.386429 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.402411 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.419664 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.436246 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.451228 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.465806 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.498010 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.517531 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.535537 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.553576 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.572724 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.588339 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:48 crc kubenswrapper[4795]: I0320 17:19:48.251735 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:48 crc kubenswrapper[4795]: I0320 17:19:48.251772 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:48 crc kubenswrapper[4795]: I0320 17:19:48.251809 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:48 crc kubenswrapper[4795]: I0320 17:19:48.251742 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:48 crc kubenswrapper[4795]: E0320 17:19:48.251939 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:48 crc kubenswrapper[4795]: E0320 17:19:48.252057 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:48 crc kubenswrapper[4795]: E0320 17:19:48.252224 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:48 crc kubenswrapper[4795]: E0320 17:19:48.252285 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.428119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.428185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.428209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.428238 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.428257 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:49Z","lastTransitionTime":"2026-03-20T17:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:49 crc kubenswrapper[4795]: E0320 17:19:49.449050 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.455117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.455181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.455204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.455235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.455256 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:49Z","lastTransitionTime":"2026-03-20T17:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:49 crc kubenswrapper[4795]: E0320 17:19:49.478493 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.483564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.483632 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.483658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.483714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.483738 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:49Z","lastTransitionTime":"2026-03-20T17:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:49 crc kubenswrapper[4795]: E0320 17:19:49.554202 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:49 crc kubenswrapper[4795]: E0320 17:19:49.554914 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:50 crc kubenswrapper[4795]: I0320 17:19:50.251973 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:50 crc kubenswrapper[4795]: I0320 17:19:50.252010 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:50 crc kubenswrapper[4795]: I0320 17:19:50.252136 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:50 crc kubenswrapper[4795]: E0320 17:19:50.252292 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:50 crc kubenswrapper[4795]: I0320 17:19:50.252327 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:50 crc kubenswrapper[4795]: E0320 17:19:50.252492 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:50 crc kubenswrapper[4795]: E0320 17:19:50.252637 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:50 crc kubenswrapper[4795]: E0320 17:19:50.252873 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:52 crc kubenswrapper[4795]: I0320 17:19:52.251673 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:52 crc kubenswrapper[4795]: I0320 17:19:52.251794 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:52 crc kubenswrapper[4795]: I0320 17:19:52.251907 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:52 crc kubenswrapper[4795]: E0320 17:19:52.252088 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:52 crc kubenswrapper[4795]: I0320 17:19:52.252126 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:52 crc kubenswrapper[4795]: E0320 17:19:52.252266 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:52 crc kubenswrapper[4795]: E0320 17:19:52.252398 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:52 crc kubenswrapper[4795]: E0320 17:19:52.252563 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:52 crc kubenswrapper[4795]: E0320 17:19:52.380636 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:19:52 crc kubenswrapper[4795]: I0320 17:19:52.979105 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.003437 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.019874 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.039551 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.062326 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.080222 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: 
I0320 17:19:53.117018 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.134537 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.156857 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd402
88c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.175926 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.196344 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.214396 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.233175 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.248912 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.282511 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.304647 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.322587 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.340834 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:54 crc kubenswrapper[4795]: I0320 17:19:54.251300 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:54 crc kubenswrapper[4795]: I0320 17:19:54.251338 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:54 crc kubenswrapper[4795]: I0320 17:19:54.251373 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:54 crc kubenswrapper[4795]: E0320 17:19:54.251514 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:54 crc kubenswrapper[4795]: I0320 17:19:54.251564 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:54 crc kubenswrapper[4795]: E0320 17:19:54.251813 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:54 crc kubenswrapper[4795]: E0320 17:19:54.251940 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:54 crc kubenswrapper[4795]: E0320 17:19:54.252122 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:54 crc kubenswrapper[4795]: I0320 17:19:54.253167 4795 scope.go:117] "RemoveContainer" containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" Mar 20 17:19:54 crc kubenswrapper[4795]: E0320 17:19:54.253433 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:19:55 crc kubenswrapper[4795]: I0320 17:19:55.458897 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 17:19:56 crc kubenswrapper[4795]: I0320 17:19:56.042497 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.042738 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.043147 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:20:28.043119656 +0000 UTC m=+171.501151227 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:56 crc kubenswrapper[4795]: I0320 17:19:56.251738 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:56 crc kubenswrapper[4795]: I0320 17:19:56.251809 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:56 crc kubenswrapper[4795]: I0320 17:19:56.251738 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.251947 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.252126 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.252285 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:56 crc kubenswrapper[4795]: I0320 17:19:56.252549 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.252834 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.268779 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.290570 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.310289 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.334478 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.353470 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: 
E0320 17:19:57.381484 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.389237 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.410647 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.426967 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd402
88c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.439132 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.452989 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.470417 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.489043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.501904 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.532965 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.551610 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.569775 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.587312 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.608034 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.018565 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/0.log" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.018626 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8c31a7c-6ccb-43e0-9c95-33b85204cc39" containerID="e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d" exitCode=1 Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.018672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerDied","Data":"e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d"} Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.019115 4795 scope.go:117] "RemoveContainer" containerID="e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.038732 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.055083 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.068366 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.090507 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.102414 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.136159 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd402
88c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.154900 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.164831 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.164965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165029 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:02.165001083 +0000 UTC m=+205.623032634 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165111 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165132 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165148 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165215 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:21:02.165192918 +0000 UTC m=+205.623224549 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.165146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.165348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165236 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165498 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:21:02.165458266 +0000 UTC m=+205.623489857 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165500 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165605 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:21:02.16558398 +0000 UTC m=+205.623615621 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.177888 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.193607 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.210450 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.219574 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.236776 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.250501 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.251674 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.251755 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.251798 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.251885 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.251984 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.252205 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.252382 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.252471 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.261096 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.266619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.266778 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.266794 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.266805 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.266846 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:21:02.266834196 +0000 UTC m=+205.724865737 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.269894 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.284154 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.300359 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.318420 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.335909 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.025245 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/0.log" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.025378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerStarted","Data":"c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86"} Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.046271 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.061003 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.077132 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.097520 4795 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.116625 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.141299 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.158956 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: 
I0320 17:19:59.190917 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.207893 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.229262 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd402
88c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.251876 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.272099 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.291833 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.311948 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.325634 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.346999 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.360319 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.369376 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.379367 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.881850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.881916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.881933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.881957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.881973 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:59Z","lastTransitionTime":"2026-03-20T17:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:59 crc kubenswrapper[4795]: E0320 17:19:59.904091 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.909031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.909119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.909141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.909160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.909205 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:59Z","lastTransitionTime":"2026-03-20T17:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:59 crc kubenswrapper[4795]: E0320 17:19:59.933464 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.938487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.938564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.938581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.938608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.938629 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:59Z","lastTransitionTime":"2026-03-20T17:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:59 crc kubenswrapper[4795]: E0320 17:19:59.958568 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.963745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.963976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.964142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.964292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.964447 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:59Z","lastTransitionTime":"2026-03-20T17:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:59 crc kubenswrapper[4795]: E0320 17:19:59.986027 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.990710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.990881 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.991016 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.991162 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.991474 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:59Z","lastTransitionTime":"2026-03-20T17:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.011579 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:00Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.011948 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:20:00 crc kubenswrapper[4795]: I0320 17:20:00.251401 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.251789 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:00 crc kubenswrapper[4795]: I0320 17:20:00.251448 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.252050 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:00 crc kubenswrapper[4795]: I0320 17:20:00.251402 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.252283 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:00 crc kubenswrapper[4795]: I0320 17:20:00.251495 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.252531 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:02 crc kubenswrapper[4795]: I0320 17:20:02.251171 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:02 crc kubenswrapper[4795]: I0320 17:20:02.251280 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:02 crc kubenswrapper[4795]: I0320 17:20:02.251204 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:02 crc kubenswrapper[4795]: I0320 17:20:02.251204 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:02 crc kubenswrapper[4795]: E0320 17:20:02.251375 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:02 crc kubenswrapper[4795]: E0320 17:20:02.251482 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:02 crc kubenswrapper[4795]: E0320 17:20:02.251604 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:02 crc kubenswrapper[4795]: E0320 17:20:02.251811 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:02 crc kubenswrapper[4795]: E0320 17:20:02.382448 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:04 crc kubenswrapper[4795]: I0320 17:20:04.251971 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:04 crc kubenswrapper[4795]: I0320 17:20:04.252020 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:04 crc kubenswrapper[4795]: I0320 17:20:04.252056 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:04 crc kubenswrapper[4795]: I0320 17:20:04.251993 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:04 crc kubenswrapper[4795]: E0320 17:20:04.252217 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:04 crc kubenswrapper[4795]: E0320 17:20:04.252344 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:04 crc kubenswrapper[4795]: E0320 17:20:04.252497 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:04 crc kubenswrapper[4795]: E0320 17:20:04.252599 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:05 crc kubenswrapper[4795]: I0320 17:20:05.252315 4795 scope.go:117] "RemoveContainer" containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.053235 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/2.log" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.056614 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.057296 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.070895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.084003 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.095352 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.110616 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718
b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.122189 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.141313 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod 
openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube
-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubec
fg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.154826 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc 
kubenswrapper[4795]: I0320 17:20:06.174730 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05
ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.189593 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.202669 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.215367 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.230023 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.242201 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.251532 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.251576 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:06 crc kubenswrapper[4795]: E0320 17:20:06.251736 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.251828 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:06 crc kubenswrapper[4795]: E0320 17:20:06.251972 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.252013 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:06 crc kubenswrapper[4795]: E0320 17:20:06.252106 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:06 crc kubenswrapper[4795]: E0320 17:20:06.252226 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.277576 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.294920 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.315005 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.332642 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.355238 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.374098 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.069671 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/3.log" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.071101 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/2.log" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.075389 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" exitCode=1 Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.075439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.075487 4795 scope.go:117] "RemoveContainer" containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.076481 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:20:07 crc kubenswrapper[4795]: E0320 17:20:07.076839 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.091213 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.109241 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.121699 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.135343 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc 
kubenswrapper[4795]: I0320 17:20:07.150830 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05
ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.164293 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.179571 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.196599 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.210876 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: 
I0320 17:20:07.234134 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:20:06Z\\\",\\\"message\\\":\\\"uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173240 7299 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 17:20:06.173280 7299 model_client.go:382] Update operations generated as: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173377 7299 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nF0320 17:20:06.173391 7299 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde1916
9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.263834 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.281096 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.294532 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.308264 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.321571 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.334338 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.347401 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
7:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.361865 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.375299 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: E0320 17:20:07.382825 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.387106 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.399655 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.408747 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.422924 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.448241 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:20:06Z\\\",\\\"message\\\":\\\"uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173240 7299 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 17:20:06.173280 7299 model_client.go:382] Update operations generated as: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173377 7299 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nF0320 17:20:06.173391 7299 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde1916
9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.460475 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.472882 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.482824 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.492355 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.505577 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.516639 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.527847 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.554562 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.567749 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.584248 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.594873 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.608493 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.631602 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.646621 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.078989 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/3.log" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.082755 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:20:08 crc kubenswrapper[4795]: 
E0320 17:20:08.082937 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.095856 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0da
f0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.111336 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.127496 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.140404 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: 
I0320 17:20:08.169087 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:20:06Z\\\",\\\"message\\\":\\\"uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173240 7299 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 17:20:06.173280 7299 model_client.go:382] Update operations 
generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173377 7299 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nF0320 17:20:06.173391 7299 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.181525 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.195860 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd402
88c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.208856 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.227589 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.245815 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.251655 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.251752 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.251809 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:08 crc kubenswrapper[4795]: E0320 17:20:08.251982 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.252140 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:08 crc kubenswrapper[4795]: E0320 17:20:08.252411 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:08 crc kubenswrapper[4795]: E0320 17:20:08.252539 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:08 crc kubenswrapper[4795]: E0320 17:20:08.252615 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.262908 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.276526 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.307205 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.327895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.342356 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.357381 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.376423 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.393608 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.409917 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.054181 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.054243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.054260 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.054290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.054307 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:10Z","lastTransitionTime":"2026-03-20T17:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.078186 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.084180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.084239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.084255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.084278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.084297 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:10Z","lastTransitionTime":"2026-03-20T17:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.099930 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.104667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.104780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.104808 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.104838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.104875 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:10Z","lastTransitionTime":"2026-03-20T17:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.126569 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.130458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.130524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.130543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.130569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.130588 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:10Z","lastTransitionTime":"2026-03-20T17:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.147104 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.151613 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.151759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.151870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.151964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.152049 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:10Z","lastTransitionTime":"2026-03-20T17:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.167165 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.167409 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.252049 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.252071 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.252178 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.252188 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.252354 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.252573 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.252651 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.252799 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:12 crc kubenswrapper[4795]: I0320 17:20:12.251620 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:12 crc kubenswrapper[4795]: I0320 17:20:12.251620 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:12 crc kubenswrapper[4795]: I0320 17:20:12.251815 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:12 crc kubenswrapper[4795]: E0320 17:20:12.251855 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:12 crc kubenswrapper[4795]: I0320 17:20:12.251781 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:12 crc kubenswrapper[4795]: E0320 17:20:12.252009 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:12 crc kubenswrapper[4795]: E0320 17:20:12.252029 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:12 crc kubenswrapper[4795]: E0320 17:20:12.252160 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:12 crc kubenswrapper[4795]: E0320 17:20:12.383981 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:14 crc kubenswrapper[4795]: I0320 17:20:14.251216 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:14 crc kubenswrapper[4795]: I0320 17:20:14.251323 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:14 crc kubenswrapper[4795]: I0320 17:20:14.251408 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:14 crc kubenswrapper[4795]: E0320 17:20:14.251499 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:14 crc kubenswrapper[4795]: I0320 17:20:14.251729 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:14 crc kubenswrapper[4795]: E0320 17:20:14.251717 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:14 crc kubenswrapper[4795]: E0320 17:20:14.251906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:14 crc kubenswrapper[4795]: E0320 17:20:14.252125 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:16 crc kubenswrapper[4795]: I0320 17:20:16.251145 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:16 crc kubenswrapper[4795]: I0320 17:20:16.251303 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:16 crc kubenswrapper[4795]: I0320 17:20:16.252263 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:16 crc kubenswrapper[4795]: I0320 17:20:16.252304 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:16 crc kubenswrapper[4795]: E0320 17:20:16.252458 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:16 crc kubenswrapper[4795]: E0320 17:20:16.252604 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:16 crc kubenswrapper[4795]: E0320 17:20:16.252856 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:16 crc kubenswrapper[4795]: E0320 17:20:16.253031 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.271917 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.293982 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.312672 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.329187 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.350903 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.365370 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: E0320 17:20:17.384522 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.388592 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.406264 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.423057 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.439485 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.451986 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: 
I0320 17:20:17.471944 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:20:06Z\\\",\\\"message\\\":\\\"uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173240 7299 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 17:20:06.173280 7299 model_client.go:382] Update operations 
generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173377 7299 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nF0320 17:20:06.173391 7299 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.484581 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.511793 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.527275 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.546181 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.561317 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.579952 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.592275 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:18 crc kubenswrapper[4795]: I0320 17:20:18.251455 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:18 crc kubenswrapper[4795]: I0320 17:20:18.251467 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:18 crc kubenswrapper[4795]: I0320 17:20:18.251621 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:18 crc kubenswrapper[4795]: I0320 17:20:18.251805 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:18 crc kubenswrapper[4795]: E0320 17:20:18.251798 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:18 crc kubenswrapper[4795]: E0320 17:20:18.251974 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:18 crc kubenswrapper[4795]: E0320 17:20:18.252353 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:18 crc kubenswrapper[4795]: E0320 17:20:18.252628 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:19 crc kubenswrapper[4795]: I0320 17:20:19.253047 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:20:19 crc kubenswrapper[4795]: E0320 17:20:19.253323 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.251375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.251423 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.251452 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.251508 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.251555 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.251836 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.251952 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.252125 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.354179 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.354246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.354265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.354289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.354309 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:20Z","lastTransitionTime":"2026-03-20T17:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.374207 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.378804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.378855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.378873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.378898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.378915 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:20Z","lastTransitionTime":"2026-03-20T17:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.398363 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.403188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.403251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.403275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.403307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.403332 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:20Z","lastTransitionTime":"2026-03-20T17:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.423555 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.428607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.428729 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.428754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.428786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.428808 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:20Z","lastTransitionTime":"2026-03-20T17:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.450819 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.456011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.456122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.456142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.456209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.456228 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:20Z","lastTransitionTime":"2026-03-20T17:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.477125 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.477307 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:20:22 crc kubenswrapper[4795]: I0320 17:20:22.252038 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:22 crc kubenswrapper[4795]: I0320 17:20:22.252058 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:22 crc kubenswrapper[4795]: I0320 17:20:22.252299 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:22 crc kubenswrapper[4795]: E0320 17:20:22.252333 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:22 crc kubenswrapper[4795]: I0320 17:20:22.252339 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:22 crc kubenswrapper[4795]: E0320 17:20:22.252824 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:22 crc kubenswrapper[4795]: E0320 17:20:22.253133 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:22 crc kubenswrapper[4795]: E0320 17:20:22.253245 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:22 crc kubenswrapper[4795]: E0320 17:20:22.385742 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:24 crc kubenswrapper[4795]: I0320 17:20:24.251540 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:24 crc kubenswrapper[4795]: I0320 17:20:24.251656 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:24 crc kubenswrapper[4795]: E0320 17:20:24.253801 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:24 crc kubenswrapper[4795]: I0320 17:20:24.251865 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:24 crc kubenswrapper[4795]: E0320 17:20:24.253949 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:24 crc kubenswrapper[4795]: I0320 17:20:24.251656 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:24 crc kubenswrapper[4795]: E0320 17:20:24.254090 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:24 crc kubenswrapper[4795]: E0320 17:20:24.254194 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:26 crc kubenswrapper[4795]: I0320 17:20:26.252002 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:26 crc kubenswrapper[4795]: E0320 17:20:26.252186 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:26 crc kubenswrapper[4795]: I0320 17:20:26.252457 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:26 crc kubenswrapper[4795]: E0320 17:20:26.252556 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:26 crc kubenswrapper[4795]: I0320 17:20:26.252820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:26 crc kubenswrapper[4795]: I0320 17:20:26.252853 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:26 crc kubenswrapper[4795]: E0320 17:20:26.252950 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:26 crc kubenswrapper[4795]: E0320 17:20:26.253089 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.302263 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" podStartSLOduration=114.30223068 podStartE2EDuration="1m54.30223068s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.279648801 +0000 UTC m=+170.737680432" watchObservedRunningTime="2026-03-20 17:20:27.30223068 +0000 UTC m=+170.760262251" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.343567 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xxwb6" podStartSLOduration=114.343534349 podStartE2EDuration="1m54.343534349s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.325523681 +0000 UTC m=+170.783555262" watchObservedRunningTime="2026-03-20 17:20:27.343534349 +0000 UTC m=+170.801565920" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.360552 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f47gv" podStartSLOduration=114.360523276 podStartE2EDuration="1m54.360523276s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.344233991 +0000 UTC m=+170.802265562" watchObservedRunningTime="2026-03-20 17:20:27.360523276 +0000 UTC m=+170.818554857" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.361099 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.361089233 podStartE2EDuration="29.361089233s" podCreationTimestamp="2026-03-20 17:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.360362791 +0000 UTC m=+170.818394372" watchObservedRunningTime="2026-03-20 17:20:27.361089233 +0000 UTC m=+170.819120804" Mar 20 17:20:27 crc kubenswrapper[4795]: E0320 17:20:27.386631 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.469401 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podStartSLOduration=114.469375587 podStartE2EDuration="1m54.469375587s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.468554081 +0000 UTC m=+170.926585642" watchObservedRunningTime="2026-03-20 17:20:27.469375587 +0000 UTC m=+170.927407178" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.469761 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" podStartSLOduration=114.469752378 podStartE2EDuration="1m54.469752378s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.447336954 +0000 UTC m=+170.905368535" watchObservedRunningTime="2026-03-20 17:20:27.469752378 +0000 UTC m=+170.927783959" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 
17:20:27.535361 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.535343 podStartE2EDuration="1m21.535343s" podCreationTimestamp="2026-03-20 17:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.534797023 +0000 UTC m=+170.992828604" watchObservedRunningTime="2026-03-20 17:20:27.535343 +0000 UTC m=+170.993374551" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.552194 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=32.552173031 podStartE2EDuration="32.552173031s" podCreationTimestamp="2026-03-20 17:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.550806469 +0000 UTC m=+171.008838030" watchObservedRunningTime="2026-03-20 17:20:27.552173031 +0000 UTC m=+171.010204612" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.643285 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pgsfb" podStartSLOduration=114.643264212 podStartE2EDuration="1m54.643264212s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.642506638 +0000 UTC m=+171.100538219" watchObservedRunningTime="2026-03-20 17:20:27.643264212 +0000 UTC m=+171.101295753" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.669889 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=73.669872226 podStartE2EDuration="1m13.669872226s" podCreationTimestamp="2026-03-20 17:19:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.668340899 +0000 UTC m=+171.126372450" watchObservedRunningTime="2026-03-20 17:20:27.669872226 +0000 UTC m=+171.127903767" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.682717 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=45.682668803 podStartE2EDuration="45.682668803s" podCreationTimestamp="2026-03-20 17:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.682034593 +0000 UTC m=+171.140066144" watchObservedRunningTime="2026-03-20 17:20:27.682668803 +0000 UTC m=+171.140700354" Mar 20 17:20:28 crc kubenswrapper[4795]: I0320 17:20:28.129879 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.130143 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.130256 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:21:32.130228264 +0000 UTC m=+235.588259835 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:20:28 crc kubenswrapper[4795]: I0320 17:20:28.251653 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:28 crc kubenswrapper[4795]: I0320 17:20:28.251669 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:28 crc kubenswrapper[4795]: I0320 17:20:28.251785 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.251969 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:28 crc kubenswrapper[4795]: I0320 17:20:28.252046 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.252147 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.252233 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.252743 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.252123 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.252161 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.252202 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.252226 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:30 crc kubenswrapper[4795]: E0320 17:20:30.252340 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:30 crc kubenswrapper[4795]: E0320 17:20:30.252438 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:30 crc kubenswrapper[4795]: E0320 17:20:30.252615 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:30 crc kubenswrapper[4795]: E0320 17:20:30.252820 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.688770 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.688839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.688862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.688890 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.688912 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:30Z","lastTransitionTime":"2026-03-20T17:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.757351 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm"] Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.757901 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.761657 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.761922 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.762078 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.766739 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.853964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e6c553b-299b-4aaf-945a-81fc44d50569-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.854024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.854049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9e6c553b-299b-4aaf-945a-81fc44d50569-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.854068 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e6c553b-299b-4aaf-945a-81fc44d50569-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.854102 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.955405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e6c553b-299b-4aaf-945a-81fc44d50569-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.955467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e6c553b-299b-4aaf-945a-81fc44d50569-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.955547 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.955763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.956075 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e6c553b-299b-4aaf-945a-81fc44d50569-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.956167 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.956259 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.957243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e6c553b-299b-4aaf-945a-81fc44d50569-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.966677 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e6c553b-299b-4aaf-945a-81fc44d50569-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.987020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e6c553b-299b-4aaf-945a-81fc44d50569-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:31 crc kubenswrapper[4795]: I0320 17:20:31.081106 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:31 crc kubenswrapper[4795]: W0320 17:20:31.106370 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e6c553b_299b_4aaf_945a_81fc44d50569.slice/crio-77cdc77d93a20ba0405bb54b910d2e2cd6164b5e8132dc3f53f6249798ad2453 WatchSource:0}: Error finding container 77cdc77d93a20ba0405bb54b910d2e2cd6164b5e8132dc3f53f6249798ad2453: Status 404 returned error can't find the container with id 77cdc77d93a20ba0405bb54b910d2e2cd6164b5e8132dc3f53f6249798ad2453 Mar 20 17:20:31 crc kubenswrapper[4795]: I0320 17:20:31.168984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" event={"ID":"9e6c553b-299b-4aaf-945a-81fc44d50569","Type":"ContainerStarted","Data":"77cdc77d93a20ba0405bb54b910d2e2cd6164b5e8132dc3f53f6249798ad2453"} Mar 20 17:20:31 crc kubenswrapper[4795]: I0320 17:20:31.284822 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 17:20:31 crc kubenswrapper[4795]: I0320 17:20:31.292800 4795 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.174045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" event={"ID":"9e6c553b-299b-4aaf-945a-81fc44d50569","Type":"ContainerStarted","Data":"b80a75e01e186a29a1d36b898a85a77f2379e025a1bd6957f193a1fe3ca00dc8"} Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.191616 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" podStartSLOduration=119.191591175 podStartE2EDuration="1m59.191591175s" podCreationTimestamp="2026-03-20 17:18:33 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:32.190863123 +0000 UTC m=+175.648894704" watchObservedRunningTime="2026-03-20 17:20:32.191591175 +0000 UTC m=+175.649622746" Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.251883 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.251980 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.252008 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:32 crc kubenswrapper[4795]: E0320 17:20:32.252062 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.251883 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:32 crc kubenswrapper[4795]: E0320 17:20:32.252176 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:32 crc kubenswrapper[4795]: E0320 17:20:32.252283 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:32 crc kubenswrapper[4795]: E0320 17:20:32.252435 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:32 crc kubenswrapper[4795]: E0320 17:20:32.388444 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 17:20:33 crc kubenswrapper[4795]: I0320 17:20:33.253134 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:20:33 crc kubenswrapper[4795]: E0320 17:20:33.253471 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:20:34 crc kubenswrapper[4795]: I0320 17:20:34.251831 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:34 crc kubenswrapper[4795]: I0320 17:20:34.251903 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:34 crc kubenswrapper[4795]: I0320 17:20:34.251837 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:34 crc kubenswrapper[4795]: E0320 17:20:34.251997 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:34 crc kubenswrapper[4795]: I0320 17:20:34.252105 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:34 crc kubenswrapper[4795]: E0320 17:20:34.252149 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:34 crc kubenswrapper[4795]: E0320 17:20:34.252249 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:34 crc kubenswrapper[4795]: E0320 17:20:34.252477 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:36 crc kubenswrapper[4795]: I0320 17:20:36.251861 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:36 crc kubenswrapper[4795]: I0320 17:20:36.251950 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:36 crc kubenswrapper[4795]: I0320 17:20:36.251957 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:36 crc kubenswrapper[4795]: I0320 17:20:36.252083 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:36 crc kubenswrapper[4795]: E0320 17:20:36.252077 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:36 crc kubenswrapper[4795]: E0320 17:20:36.252258 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:36 crc kubenswrapper[4795]: E0320 17:20:36.252463 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:36 crc kubenswrapper[4795]: E0320 17:20:36.252523 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:37 crc kubenswrapper[4795]: E0320 17:20:37.389067 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:38 crc kubenswrapper[4795]: I0320 17:20:38.251041 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:38 crc kubenswrapper[4795]: I0320 17:20:38.251084 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:38 crc kubenswrapper[4795]: I0320 17:20:38.251130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:38 crc kubenswrapper[4795]: E0320 17:20:38.251183 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:38 crc kubenswrapper[4795]: I0320 17:20:38.251197 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:38 crc kubenswrapper[4795]: E0320 17:20:38.251269 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:38 crc kubenswrapper[4795]: E0320 17:20:38.251405 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:38 crc kubenswrapper[4795]: E0320 17:20:38.251501 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:40 crc kubenswrapper[4795]: I0320 17:20:40.251469 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:40 crc kubenswrapper[4795]: I0320 17:20:40.251487 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:40 crc kubenswrapper[4795]: E0320 17:20:40.252039 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:40 crc kubenswrapper[4795]: I0320 17:20:40.251575 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:40 crc kubenswrapper[4795]: I0320 17:20:40.251536 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:40 crc kubenswrapper[4795]: E0320 17:20:40.252218 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:40 crc kubenswrapper[4795]: E0320 17:20:40.252340 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:40 crc kubenswrapper[4795]: E0320 17:20:40.252461 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:42 crc kubenswrapper[4795]: I0320 17:20:42.251851 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:42 crc kubenswrapper[4795]: I0320 17:20:42.251858 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:42 crc kubenswrapper[4795]: E0320 17:20:42.252067 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:42 crc kubenswrapper[4795]: I0320 17:20:42.251876 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:42 crc kubenswrapper[4795]: I0320 17:20:42.251858 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:42 crc kubenswrapper[4795]: E0320 17:20:42.252262 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:42 crc kubenswrapper[4795]: E0320 17:20:42.252358 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:42 crc kubenswrapper[4795]: E0320 17:20:42.252422 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:42 crc kubenswrapper[4795]: E0320 17:20:42.390743 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.223598 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/1.log" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.224288 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/0.log" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.224335 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8c31a7c-6ccb-43e0-9c95-33b85204cc39" containerID="c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86" exitCode=1 Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.224375 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerDied","Data":"c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86"} Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.224420 4795 scope.go:117] "RemoveContainer" containerID="e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.225131 4795 scope.go:117] "RemoveContainer" containerID="c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86" Mar 20 17:20:44 crc kubenswrapper[4795]: E0320 17:20:44.225433 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xxwb6_openshift-multus(c8c31a7c-6ccb-43e0-9c95-33b85204cc39)\"" pod="openshift-multus/multus-xxwb6" podUID="c8c31a7c-6ccb-43e0-9c95-33b85204cc39" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.252086 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:44 crc kubenswrapper[4795]: E0320 17:20:44.252311 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.252669 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:44 crc kubenswrapper[4795]: E0320 17:20:44.252836 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.253129 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:44 crc kubenswrapper[4795]: E0320 17:20:44.253257 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.253490 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:44 crc kubenswrapper[4795]: E0320 17:20:44.253623 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:45 crc kubenswrapper[4795]: I0320 17:20:45.230999 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/1.log" Mar 20 17:20:45 crc kubenswrapper[4795]: I0320 17:20:45.252852 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:20:45 crc kubenswrapper[4795]: E0320 17:20:45.253133 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:20:46 crc kubenswrapper[4795]: I0320 17:20:46.251083 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:46 crc kubenswrapper[4795]: I0320 17:20:46.251197 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:46 crc kubenswrapper[4795]: E0320 17:20:46.251252 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:46 crc kubenswrapper[4795]: I0320 17:20:46.251334 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:46 crc kubenswrapper[4795]: E0320 17:20:46.251628 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:46 crc kubenswrapper[4795]: E0320 17:20:46.251725 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:46 crc kubenswrapper[4795]: I0320 17:20:46.253117 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:46 crc kubenswrapper[4795]: E0320 17:20:46.253487 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:47 crc kubenswrapper[4795]: E0320 17:20:47.391398 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:48 crc kubenswrapper[4795]: I0320 17:20:48.251459 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:48 crc kubenswrapper[4795]: I0320 17:20:48.251526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:48 crc kubenswrapper[4795]: I0320 17:20:48.251545 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:48 crc kubenswrapper[4795]: E0320 17:20:48.251607 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:48 crc kubenswrapper[4795]: I0320 17:20:48.251486 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:48 crc kubenswrapper[4795]: E0320 17:20:48.251901 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:48 crc kubenswrapper[4795]: E0320 17:20:48.252019 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:48 crc kubenswrapper[4795]: E0320 17:20:48.252061 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:50 crc kubenswrapper[4795]: I0320 17:20:50.252023 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:50 crc kubenswrapper[4795]: E0320 17:20:50.252204 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:50 crc kubenswrapper[4795]: I0320 17:20:50.252300 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:50 crc kubenswrapper[4795]: I0320 17:20:50.252323 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:50 crc kubenswrapper[4795]: E0320 17:20:50.252386 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:50 crc kubenswrapper[4795]: E0320 17:20:50.252533 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:50 crc kubenswrapper[4795]: I0320 17:20:50.252341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:50 crc kubenswrapper[4795]: E0320 17:20:50.252651 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:52 crc kubenswrapper[4795]: I0320 17:20:52.251596 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:52 crc kubenswrapper[4795]: I0320 17:20:52.251644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:52 crc kubenswrapper[4795]: I0320 17:20:52.251665 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:52 crc kubenswrapper[4795]: E0320 17:20:52.251831 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:52 crc kubenswrapper[4795]: I0320 17:20:52.251862 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:52 crc kubenswrapper[4795]: E0320 17:20:52.252034 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:52 crc kubenswrapper[4795]: E0320 17:20:52.252164 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:52 crc kubenswrapper[4795]: E0320 17:20:52.252435 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:52 crc kubenswrapper[4795]: E0320 17:20:52.392517 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:54 crc kubenswrapper[4795]: I0320 17:20:54.251953 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:54 crc kubenswrapper[4795]: I0320 17:20:54.252017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:54 crc kubenswrapper[4795]: E0320 17:20:54.252162 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:54 crc kubenswrapper[4795]: E0320 17:20:54.252297 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:54 crc kubenswrapper[4795]: I0320 17:20:54.252660 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:54 crc kubenswrapper[4795]: I0320 17:20:54.252976 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:54 crc kubenswrapper[4795]: E0320 17:20:54.252934 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:54 crc kubenswrapper[4795]: E0320 17:20:54.253094 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:56 crc kubenswrapper[4795]: I0320 17:20:56.252121 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:56 crc kubenswrapper[4795]: I0320 17:20:56.252182 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:56 crc kubenswrapper[4795]: I0320 17:20:56.252302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:56 crc kubenswrapper[4795]: E0320 17:20:56.252543 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:56 crc kubenswrapper[4795]: I0320 17:20:56.252601 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:56 crc kubenswrapper[4795]: E0320 17:20:56.252805 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:56 crc kubenswrapper[4795]: E0320 17:20:56.252985 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:56 crc kubenswrapper[4795]: E0320 17:20:56.253119 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:57 crc kubenswrapper[4795]: I0320 17:20:57.253990 4795 scope.go:117] "RemoveContainer" containerID="c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86" Mar 20 17:20:57 crc kubenswrapper[4795]: E0320 17:20:57.393247 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.251440 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.251529 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.251446 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.251557 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:58 crc kubenswrapper[4795]: E0320 17:20:58.251966 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:58 crc kubenswrapper[4795]: E0320 17:20:58.251802 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:58 crc kubenswrapper[4795]: E0320 17:20:58.251659 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:58 crc kubenswrapper[4795]: E0320 17:20:58.252122 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.282161 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/1.log" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.282263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerStarted","Data":"199d60669fc8f63b3b210d2fc85e721bcf838edabcdff0694939a52f882125e7"} Mar 20 17:21:00 crc kubenswrapper[4795]: I0320 17:21:00.251314 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:00 crc kubenswrapper[4795]: I0320 17:21:00.251393 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:00 crc kubenswrapper[4795]: I0320 17:21:00.251523 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:00 crc kubenswrapper[4795]: E0320 17:21:00.251778 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:21:00 crc kubenswrapper[4795]: E0320 17:21:00.252373 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:21:00 crc kubenswrapper[4795]: E0320 17:21:00.252568 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:21:00 crc kubenswrapper[4795]: I0320 17:21:00.252876 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:21:00 crc kubenswrapper[4795]: I0320 17:21:00.253045 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:00 crc kubenswrapper[4795]: E0320 17:21:00.253296 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.189851 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jpp4c"] Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.190002 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:01 crc kubenswrapper[4795]: E0320 17:21:01.190153 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.299222 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/3.log" Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.302767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.303346 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.343934 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podStartSLOduration=148.343903467 podStartE2EDuration="2m28.343903467s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:01.342869101 +0000 UTC m=+204.800900732" watchObservedRunningTime="2026-03-20 17:21:01.343903467 +0000 UTC m=+204.801935048" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.216071 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216369 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:23:04.21633232 +0000 UTC m=+327.674363901 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.216521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.216568 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.216611 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216628 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216763 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:23:04.216747174 +0000 UTC m=+327.674778745 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216901 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216928 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216920 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216946 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.217069 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:23:04.217033614 +0000 UTC m=+327.675065265 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.217147 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:23:04.217128098 +0000 UTC m=+327.675159669 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.251285 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.251399 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.251302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.251509 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.251651 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.251810 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.317579 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.317888 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.317923 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.317943 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.318053 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:23:04.31802719 +0000 UTC m=+327.776058761 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.395197 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:21:03 crc kubenswrapper[4795]: I0320 17:21:03.251358 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:03 crc kubenswrapper[4795]: E0320 17:21:03.251571 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:21:04 crc kubenswrapper[4795]: I0320 17:21:04.252042 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:04 crc kubenswrapper[4795]: I0320 17:21:04.252048 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:04 crc kubenswrapper[4795]: I0320 17:21:04.252238 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:04 crc kubenswrapper[4795]: E0320 17:21:04.252245 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:21:04 crc kubenswrapper[4795]: E0320 17:21:04.253076 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:21:04 crc kubenswrapper[4795]: E0320 17:21:04.253403 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:21:05 crc kubenswrapper[4795]: I0320 17:21:05.252144 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:05 crc kubenswrapper[4795]: E0320 17:21:05.252313 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:21:06 crc kubenswrapper[4795]: I0320 17:21:06.252274 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:06 crc kubenswrapper[4795]: I0320 17:21:06.252396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:06 crc kubenswrapper[4795]: I0320 17:21:06.252281 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:06 crc kubenswrapper[4795]: E0320 17:21:06.252473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:21:06 crc kubenswrapper[4795]: E0320 17:21:06.252579 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 17:21:06 crc kubenswrapper[4795]: E0320 17:21:06.252732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 17:21:07 crc kubenswrapper[4795]: I0320 17:21:07.251620 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c"
Mar 20 17:21:07 crc kubenswrapper[4795]: E0320 17:21:07.253609 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77"
Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.251265 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.251366 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.251367 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.253743 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.253775 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.254389 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.254490 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 17:21:09 crc kubenswrapper[4795]: I0320 17:21:09.252109 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c"
Mar 20 17:21:09 crc kubenswrapper[4795]: I0320 17:21:09.262023 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 17:21:09 crc kubenswrapper[4795]: I0320 17:21:09.262459 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.301098 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.301460 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.526510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.597632 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bl2bp"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.598817 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.610001 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.611113 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.612040 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.619468 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.621538 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.621900 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.622924 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.623160 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.623423 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.636427 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.637508 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.638350 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.639077 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.639339 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hn4r8"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.639780 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.641394 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.641721 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.642354 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.642634 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.643076 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.643084 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.644326 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.648495 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-xzx7n"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.649201 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xzx7n"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.650052 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.652170 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.652843 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.653965 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.654265 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.654374 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.654953 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.655526 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.655736 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656019 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656203 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656078 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656731 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656868 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656996 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.657117 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.658010 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.658453 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.661810 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.662774 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.663618 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-97wlq"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.664401 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-97wlq"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.665084 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.665566 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.666189 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gplds"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.666823 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.669230 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mmtf7"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.669670 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.670862 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.671387 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.673310 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.674024 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.677814 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.678258 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-45pjp"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.678442 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.678756 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.685544 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.686010 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.686265 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.686675 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.686990 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.687159 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.688445 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.688733 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.688959 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.689237 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.692910 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.693272 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.693825 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.694130 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.694238 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.703353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.693920 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.704133 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.704869 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.705015 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.706346 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.706789 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.707420 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.711023 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.708799 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.712435 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.710642 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cdrcc"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.712631 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.712929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.710864 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.714570 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.709240 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.709307 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.709436 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.709614 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.709659 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.710386 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.710449 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.712300 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.714991 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.715311 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.715672 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.715912 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.717114 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.717363 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.730854 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.731068 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.731248 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.731419 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.736576 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.736948 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.737606 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.737801 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.737887 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.738052 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.738163 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.738892 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.739976 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740247 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-images\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-audit\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740326 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740384 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740397 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5lkn\" (UniqueName: \"kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-node-pullsecrets\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740531 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740799 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.741432 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742097 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-encryption-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742475 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-serving-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742510 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-image-import-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8l7c\" (UniqueName: \"kubernetes.io/projected/9f31b9ac-9447-4b20-ac60-7532edfa4600-kube-api-access-q8l7c\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742578 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-client\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742621 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-audit-dir\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-serving-cert\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f31b9ac-9447-4b20-ac60-7532edfa4600-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742738 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9ptd\" (UniqueName: \"kubernetes.io/projected/5c21571e-5513-46e0-9eed-4ec64df8e445-kube-api-access-w9ptd\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dskm\" (UniqueName: \"kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742853 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742874 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-config\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742896 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.743025 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.743145 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.743361 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.744028 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.744945 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.746698 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.748890 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.749535 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.749631 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.751831 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bl2bp"]
Mar 20 17:21:11 crc
kubenswrapper[4795]: I0320 17:21:11.752239 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.753051 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.753814 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.754007 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.754081 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.754234 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.758193 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.760338 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.760642 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-lrxrs"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.761137 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.761252 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.761552 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.761763 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.763942 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.764452 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.781769 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.782762 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.784380 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.785572 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xzx7n"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.789333 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.789723 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gpx9r"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 
17:21:11.790459 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.794341 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.794886 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.795534 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.796447 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.796564 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.797019 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.797768 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.798616 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.798847 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.800070 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567120-j7789"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.800403 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.802041 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.802193 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c49vv"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.802520 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.802855 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.803578 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.804863 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.805632 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.806008 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.806741 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.806787 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.807592 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.809291 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kgsw2"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.809778 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.810089 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.810422 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.810597 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.810746 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.811430 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.811916 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.812628 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.812989 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.813889 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8v58t"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.814572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.815843 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-45pjp"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.816866 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.818086 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.819391 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.820811 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.821105 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console/console-f9d7485db-hn4r8"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.822410 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.823725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.825492 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cdrcc"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.826633 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.828267 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.829560 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.830560 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.831532 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.832711 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c49vv"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.834057 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k"] Mar 20 
17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.835123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.836473 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.837763 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-97wlq"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.838733 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.840906 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mmtf7"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.841204 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.843305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844386 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92f4\" (UniqueName: 
\"kubernetes.io/projected/0415738e-f327-433a-9a28-0a991138e021-kube-api-access-n92f4\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-image-import-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844463 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8l7c\" (UniqueName: \"kubernetes.io/projected/9f31b9ac-9447-4b20-ac60-7532edfa4600-kube-api-access-q8l7c\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89cb82f-a141-419f-bf33-93c219c84e51-metrics-tls\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844549 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-client\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5scj\" (UniqueName: \"kubernetes.io/projected/2dbe21c7-d209-4259-b51d-b486b741e9c7-kube-api-access-c5scj\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-audit-dir\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844631 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844651 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6f292b8-878f-418e-8c85-2f7818e9dba1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-machine-approver-tls\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844725 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-serving-cert\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844837 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-audit-dir\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.845129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-serving-cert\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.845171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/9f31b9ac-9447-4b20-ac60-7532edfa4600-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.845717 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-image-import-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9ptd\" (UniqueName: \"kubernetes.io/projected/5c21571e-5513-46e0-9eed-4ec64df8e445-kube-api-access-w9ptd\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-encryption-config\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846395 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846422 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f292b8-878f-418e-8c85-2f7818e9dba1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846442 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89cb82f-a141-419f-bf33-93c219c84e51-trusted-ca\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846472 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dskm\" (UniqueName: \"kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0415738e-f327-433a-9a28-0a991138e021-audit-dir\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-config\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846607 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2dbe21c7-d209-4259-b51d-b486b741e9c7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846632 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846651 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846677 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-images\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n6pt\" (UniqueName: \"kubernetes.io/projected/e6f292b8-878f-418e-8c85-2f7818e9dba1-kube-api-access-7n6pt\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.847109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc 
kubenswrapper[4795]: I0320 17:21:11.847476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.847594 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-audit\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.847722 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.847863 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.851113 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.848212 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-images\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: 
\"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.848819 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-audit\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.850750 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-client\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.849231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.850797 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-j7789"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.852169 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.852520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.853056 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.853134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-config\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.854518 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5gvs\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-kube-api-access-g5gvs\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.854646 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gpx9r"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.854894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.855884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857443 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-auth-proxy-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5lkn\" (UniqueName: \"kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc 
kubenswrapper[4795]: I0320 17:21:11.857631 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb25j\" (UniqueName: \"kubernetes.io/projected/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-kube-api-access-lb25j\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857761 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-etcd-client\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-encryption-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " 
pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-node-pullsecrets\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857866 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857894 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-audit-policies\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-serving-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dbe21c7-d209-4259-b51d-b486b741e9c7-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: 
\"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.858411 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.858495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-node-pullsecrets\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.858760 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.859283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.859390 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.859830 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-serving-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.860817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.861202 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f31b9ac-9447-4b20-ac60-7532edfa4600-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.861223 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.861766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-encryption-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.863134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-serving-cert\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " 
pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.863438 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.864020 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.865303 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.866448 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kgsw2"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.867831 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.868982 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-454wp"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.870837 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.870916 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.872928 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5v822"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.873893 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5v822" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.874264 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.875485 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.876903 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-454wp"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.878575 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5v822"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.880049 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gdx4t"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.881600 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.882327 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gdx4t"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.882403 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.900921 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.921126 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.941350 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.958669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-encryption-config\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.958846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f292b8-878f-418e-8c85-2f7818e9dba1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.958946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0415738e-f327-433a-9a28-0a991138e021-audit-dir\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89cb82f-a141-419f-bf33-93c219c84e51-trusted-ca\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2dbe21c7-d209-4259-b51d-b486b741e9c7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959614 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2dbe21c7-d209-4259-b51d-b486b741e9c7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959612 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc 
kubenswrapper[4795]: I0320 17:21:11.959741 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n6pt\" (UniqueName: \"kubernetes.io/projected/e6f292b8-878f-418e-8c85-2f7818e9dba1-kube-api-access-7n6pt\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5gvs\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-kube-api-access-g5gvs\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960031 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-auth-proxy-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb25j\" (UniqueName: 
\"kubernetes.io/projected/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-kube-api-access-lb25j\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960102 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-etcd-client\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-audit-policies\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960164 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dbe21c7-d209-4259-b51d-b486b741e9c7-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960181 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960195 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n92f4\" (UniqueName: \"kubernetes.io/projected/0415738e-f327-433a-9a28-0a991138e021-kube-api-access-n92f4\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959107 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0415738e-f327-433a-9a28-0a991138e021-audit-dir\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89cb82f-a141-419f-bf33-93c219c84e51-metrics-tls\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960287 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5scj\" (UniqueName: \"kubernetes.io/projected/2dbe21c7-d209-4259-b51d-b486b741e9c7-kube-api-access-c5scj\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960313 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6f292b8-878f-418e-8c85-2f7818e9dba1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960337 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-serving-cert\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-machine-approver-tls\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960938 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-auth-proxy-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960955 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.961044 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-encryption-config\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.961440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.962162 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.962197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-audit-policies\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.962258 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6f292b8-878f-418e-8c85-2f7818e9dba1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.963206 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89cb82f-a141-419f-bf33-93c219c84e51-trusted-ca\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.963475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-machine-approver-tls\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.964070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-etcd-client\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.964495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89cb82f-a141-419f-bf33-93c219c84e51-metrics-tls\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.964985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-serving-cert\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.965261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dbe21c7-d209-4259-b51d-b486b741e9c7-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.973427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f292b8-878f-418e-8c85-2f7818e9dba1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.982145 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.003929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.021929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.041380 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.061557 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.081604 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.101210 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 
20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.120841 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.162263 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.181349 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.202779 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.222951 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.253926 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.261472 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.282012 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.301997 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.322586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.342817 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.362466 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.382001 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.402491 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.421041 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.442641 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.462300 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.482167 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.503876 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.541834 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.563193 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.582232 4795 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.601032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.621969 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.641616 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.662038 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.682771 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.702448 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.721816 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.741533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.761811 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.782644 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.802977 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.820459 4795 request.go:700] Waited for 1.016499818s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmcc-proxy-tls&limit=500&resourceVersion=0 Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.822815 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.842129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.862163 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.883075 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.902807 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.922495 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.942511 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.961880 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.003586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.003950 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.021290 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.042042 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.062600 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.081880 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.103889 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.121834 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.142094 4795 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.162476 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.181945 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.203208 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.221927 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.241991 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.261929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.282197 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.302089 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.322233 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.342384 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.362520 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.382340 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.402468 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.421950 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.441909 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.461835 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.481764 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.502099 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.551091 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8l7c\" (UniqueName: \"kubernetes.io/projected/9f31b9ac-9447-4b20-ac60-7532edfa4600-kube-api-access-q8l7c\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: 
\"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.571652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9ptd\" (UniqueName: \"kubernetes.io/projected/5c21571e-5513-46e0-9eed-4ec64df8e445-kube-api-access-w9ptd\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.589331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dskm\" (UniqueName: \"kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.604100 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.609104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5lkn\" (UniqueName: \"kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.622649 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.642303 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.662651 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.682303 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.701687 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.722534 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.741636 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.741783 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.761973 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.765017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.781923 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.786662 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.803749 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.835197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.840207 4795 request.go:700] Waited for 1.879866724s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/serviceaccounts/openshift-controller-manager-operator/token Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.849764 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5gvs\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-kube-api-access-g5gvs\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.878402 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n6pt\" (UniqueName: \"kubernetes.io/projected/e6f292b8-878f-418e-8c85-2f7818e9dba1-kube-api-access-7n6pt\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.902339 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.904690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb25j\" (UniqueName: \"kubernetes.io/projected/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-kube-api-access-lb25j\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.908829 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92f4\" (UniqueName: \"kubernetes.io/projected/0415738e-f327-433a-9a28-0a991138e021-kube-api-access-n92f4\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.926592 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.933296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5scj\" (UniqueName: \"kubernetes.io/projected/2dbe21c7-d209-4259-b51d-b486b741e9c7-kube-api-access-c5scj\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.951187 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983554 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpnfm\" (UniqueName: \"kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983658 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983685 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983738 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983870 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdtvs\" (UniqueName: \"kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0589e639-75bf-4a26-a80b-dbb69a6c9955-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984075 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2spg\" (UniqueName: 
\"kubernetes.io/projected/e5e80a44-9bdc-4321-9536-8eba4527f181-kube-api-access-s2spg\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984220 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmx2\" (UniqueName: \"kubernetes.io/projected/5c603995-8326-4bea-892a-74ee1e8c8dea-kube-api-access-bzmx2\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr9b7\" (UniqueName: \"kubernetes.io/projected/109e018a-9ad5-40e6-bd49-07d49d718161-kube-api-access-cr9b7\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984322 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-serving-cert\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-427xq\" (UniqueName: \"kubernetes.io/projected/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-kube-api-access-427xq\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984364 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-serving-cert\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984425 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e639-75bf-4a26-a80b-dbb69a6c9955-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984444 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/32dc8fa2-0199-444e-9983-4af0fb9172b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: \"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: 
\"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984905 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shn6h\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-kube-api-access-shn6h\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-config\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: E0320 17:21:13.985261 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:14.485247731 +0000 UTC m=+217.943279282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985356 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-trusted-ca\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985513 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mhf6\" (UniqueName: \"kubernetes.io/projected/7074cf98-12f4-4a73-ad96-4959f64398a7-kube-api-access-5mhf6\") pod \"downloads-7954f5f757-xzx7n\" (UID: \"7074cf98-12f4-4a73-ad96-4959f64398a7\") " pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-service-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985663 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2669j\" (UniqueName: \"kubernetes.io/projected/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-kube-api-access-2669j\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985712 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2bwk\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985851 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c2qp\" (UniqueName: \"kubernetes.io/projected/32dc8fa2-0199-444e-9983-4af0fb9172b1-kube-api-access-8c2qp\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: \"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985915 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985942 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-config\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-client\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: 
\"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/109e018a-9ad5-40e6-bd49-07d49d718161-metrics-tls\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986431 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-config\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986634 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c603995-8326-4bea-892a-74ee1e8c8dea-serving-cert\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.998207 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.004103 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.064848 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bl2bp"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087372 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087501 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe2960a0-9218-4d46-8c50-7285c5e27882-proxy-tls\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087537 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/587836f8-b700-43d0-940e-81d7820b2a6b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-config-volume\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-srv-cert\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087591 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-cabundle\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087605 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-images\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmx2\" (UniqueName: 
\"kubernetes.io/projected/5c603995-8326-4bea-892a-74ee1e8c8dea-kube-api-access-bzmx2\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwn5\" (UniqueName: \"kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5\") pod \"auto-csr-approver-29567120-j7789\" (UID: \"bed1d31b-b060-45c3-95bf-3b226a36efe1\") " pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbs2f\" (UniqueName: \"kubernetes.io/projected/5611db8a-18df-426e-a6e7-7f6720da4109-kube-api-access-nbs2f\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087674 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-stats-auth\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087705 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587836f8-b700-43d0-940e-81d7820b2a6b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-metrics-certs\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087773 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jmdf\" (UniqueName: \"kubernetes.io/projected/661df377-ed57-4f75-9be9-3fc5f87cf37e-kube-api-access-4jmdf\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqj4\" (UniqueName: \"kubernetes.io/projected/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-kube-api-access-wvqj4\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087812 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-serving-cert\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087845 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shn6h\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-kube-api-access-shn6h\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0068e5c-7377-479d-9cc5-fd1270c74b33-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-webhook-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087932 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzk8h\" (UniqueName: \"kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087956 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087997 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xln6k\" (UniqueName: \"kubernetes.io/projected/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-kube-api-access-xln6k\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088016 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2bwk\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-profile-collector-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088073 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c2qp\" (UniqueName: \"kubernetes.io/projected/32dc8fa2-0199-444e-9983-4af0fb9172b1-kube-api-access-8c2qp\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: 
\"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cdbk\" (UniqueName: \"kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-client\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088152 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088168 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088202 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbc5q\" (UniqueName: \"kubernetes.io/projected/ff31d5af-4eae-43e7-8512-c6f5d54501e1-kube-api-access-lbc5q\") pod \"migrator-59844c95c7-q9zkh\" (UID: \"ff31d5af-4eae-43e7-8512-c6f5d54501e1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a61cce87-0b4f-4886-a347-b98aecad272a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qj44k\" (UID: 
\"a61cce87-0b4f-4886-a347-b98aecad272a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088263 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088277 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-config\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088301 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088316 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2da6bf3-e17d-4adf-96dd-ea097cae192b-proxy-tls\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c603995-8326-4bea-892a-74ee1e8c8dea-serving-cert\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/661df377-ed57-4f75-9be9-3fc5f87cf37e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-registration-dir\") pod \"csi-hostpathplugin-454wp\" (UID: 
\"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc57139-c6ad-4639-a09f-d07f8da49f4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088443 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-node-bootstrap-token\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfz4k\" (UniqueName: \"kubernetes.io/projected/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-kube-api-access-jfz4k\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088511 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-metrics-tls\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088545 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdtvs\" (UniqueName: \"kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088577 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0589e639-75bf-4a26-a80b-dbb69a6c9955-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtbkf\" (UniqueName: \"kubernetes.io/projected/7479d10c-1c3b-497e-8dda-07cd22aeccf0-kube-api-access-qtbkf\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2spg\" (UniqueName: \"kubernetes.io/projected/e5e80a44-9bdc-4321-9536-8eba4527f181-kube-api-access-s2spg\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5611db8a-18df-426e-a6e7-7f6720da4109-tmpfs\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlnwq\" (UniqueName: 
\"kubernetes.io/projected/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-kube-api-access-qlnwq\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: \"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089847 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089874 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-default-certificate\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090284 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 
17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090346 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-mountpoint-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090413 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr9b7\" (UniqueName: \"kubernetes.io/projected/109e018a-9ad5-40e6-bd49-07d49d718161-kube-api-access-cr9b7\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090435 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428476fd-a8f2-4ffc-bda6-f19da80778ac-cert\") pod \"ingress-canary-gdx4t\" (UID: 
\"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090524 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-serving-cert\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-427xq\" (UniqueName: \"kubernetes.io/projected/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-kube-api-access-427xq\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090570 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-serving-cert\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e639-75bf-4a26-a80b-dbb69a6c9955-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/32dc8fa2-0199-444e-9983-4af0fb9172b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: \"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090688 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-certs\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090728 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/661df377-ed57-4f75-9be9-3fc5f87cf37e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090781 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-config\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090894 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090943 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jpd\" (UniqueName: \"kubernetes.io/projected/b4698fe3-a607-4978-bad4-5b83d3beb21b-kube-api-access-k2jpd\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bphg\" (UniqueName: \"kubernetes.io/projected/489757b2-0de4-4275-8931-daa5c3b4a75a-kube-api-access-2bphg\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 
17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091014 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-trusted-ca\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mhf6\" (UniqueName: \"kubernetes.io/projected/7074cf98-12f4-4a73-ad96-4959f64398a7-kube-api-access-5mhf6\") pod \"downloads-7954f5f757-xzx7n\" (UID: \"7074cf98-12f4-4a73-ad96-4959f64398a7\") " pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091058 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-service-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd81e-16ce-4b5c-8667-85c9426a9221-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gpx9r\" (UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsz6b\" (UniqueName: \"kubernetes.io/projected/ae2bd81e-16ce-4b5c-8667-85c9426a9221-kube-api-access-gsz6b\") pod \"multus-admission-controller-857f4d67dd-gpx9r\" 
(UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091142 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091165 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2669j\" (UniqueName: \"kubernetes.io/projected/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-kube-api-access-2669j\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091190 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-apiservice-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091237 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7dc57139-c6ad-4639-a09f-d07f8da49f4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091263 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7c4k\" (UniqueName: \"kubernetes.io/projected/067aa008-8dda-4bfe-bfd2-388abdb54299-kube-api-access-h7c4k\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091330 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fj9p\" (UniqueName: \"kubernetes.io/projected/a61cce87-0b4f-4886-a347-b98aecad272a-kube-api-access-8fj9p\") pod \"package-server-manager-789f6589d5-qj44k\" (UID: \"a61cce87-0b4f-4886-a347-b98aecad272a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-config\") pod 
\"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091426 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0068e5c-7377-479d-9cc5-fd1270c74b33-config\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.091478 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:14.591461754 +0000 UTC m=+218.049493295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091646 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-srv-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091677 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0068e5c-7377-479d-9cc5-fd1270c74b33-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091757 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091775 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/109e018a-9ad5-40e6-bd49-07d49d718161-metrics-tls\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " pod="openshift-dns-operator/dns-operator-744455d44c-45pjp"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091799 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-key\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9cd4\" (UniqueName: \"kubernetes.io/projected/fe2960a0-9218-4d46-8c50-7285c5e27882-kube-api-access-m9cd4\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091862 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-config\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091905 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2pxd\" (UniqueName: \"kubernetes.io/projected/428476fd-a8f2-4ffc-bda6-f19da80778ac-kube-api-access-c2pxd\") pod \"ingress-canary-gdx4t\" (UID: \"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-plugins-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091990 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.092014 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587836f8-b700-43d0-940e-81d7820b2a6b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.092043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.092068 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpnfm\" (UniqueName: \"kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.092095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093479 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093591 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg7w2\" (UniqueName: \"kubernetes.io/projected/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-kube-api-access-mg7w2\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/067aa008-8dda-4bfe-bfd2-388abdb54299-service-ca-bundle\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093718 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc57139-c6ad-4639-a09f-d07f8da49f4e-config\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2da6bf3-e17d-4adf-96dd-ea097cae192b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmgxw\" (UniqueName: \"kubernetes.io/projected/d2da6bf3-e17d-4adf-96dd-ea097cae192b-kube-api-access-tmgxw\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093795 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-csi-data-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: \"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.094498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.094722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-config\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.094752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.094935 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.095505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.095657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.095979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.096534 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.096575 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.096664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.097091 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-config\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.097657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-trusted-ca\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.098109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.098621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-service-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.099672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.099709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-client\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.100025 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0589e639-75bf-4a26-a80b-dbb69a6c9955-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.100296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.100486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-socket-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.100881 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-config\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.101218 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.101668 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.103187 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/109e018a-9ad5-40e6-bd49-07d49d718161-metrics-tls\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " pod="openshift-dns-operator/dns-operator-744455d44c-45pjp"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.104557 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-serving-cert\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.105821 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e639-75bf-4a26-a80b-dbb69a6c9955-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.106342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/32dc8fa2-0199-444e-9983-4af0fb9172b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: \"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.106608 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c603995-8326-4bea-892a-74ee1e8c8dea-serving-cert\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.106984 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.111095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.111207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.111518 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.111626 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-serving-cert\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.111853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.112322 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.112591 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.117965 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2spg\" (UniqueName: \"kubernetes.io/projected/e5e80a44-9bdc-4321-9536-8eba4527f181-kube-api-access-s2spg\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.132302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"]
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.136059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shn6h\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-kube-api-access-shn6h\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.145790 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hn4r8"]
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.159803 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2bwk\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8"
Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.173330 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f31b9ac_9447_4b20_ac60_7532edfa4600.slice/crio-bb67d7239ac81c201272fb1b9adeb1ecbf97d522b376528a2f9bfe7e33a8004b WatchSource:0}: Error finding container bb67d7239ac81c201272fb1b9adeb1ecbf97d522b376528a2f9bfe7e33a8004b: Status 404 returned error can't find the container with id bb67d7239ac81c201272fb1b9adeb1ecbf97d522b376528a2f9bfe7e33a8004b
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.180134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c2qp\" (UniqueName: \"kubernetes.io/projected/32dc8fa2-0199-444e-9983-4af0fb9172b1-kube-api-access-8c2qp\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: \"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.180948 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb"]
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.193819 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdtvs\" (UniqueName: \"kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.199080 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc"]
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201255 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-profile-collector-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cdbk\" (UniqueName: \"kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201299 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbc5q\" (UniqueName: \"kubernetes.io/projected/ff31d5af-4eae-43e7-8512-c6f5d54501e1-kube-api-access-lbc5q\") pod \"migrator-59844c95c7-q9zkh\" (UID: \"ff31d5af-4eae-43e7-8512-c6f5d54501e1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a61cce87-0b4f-4886-a347-b98aecad272a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qj44k\" (UID: \"a61cce87-0b4f-4886-a347-b98aecad272a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-config\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2da6bf3-e17d-4adf-96dd-ea097cae192b-proxy-tls\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/661df377-ed57-4f75-9be9-3fc5f87cf37e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-registration-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201423 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc57139-c6ad-4639-a09f-d07f8da49f4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-node-bootstrap-token\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201454 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfz4k\" (UniqueName: \"kubernetes.io/projected/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-kube-api-access-jfz4k\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-metrics-tls\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtbkf\" (UniqueName: \"kubernetes.io/projected/7479d10c-1c3b-497e-8dda-07cd22aeccf0-kube-api-access-qtbkf\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5611db8a-18df-426e-a6e7-7f6720da4109-tmpfs\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlnwq\" (UniqueName: \"kubernetes.io/projected/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-kube-api-access-qlnwq\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: \"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201555 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-default-certificate\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201577 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-mountpoint-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428476fd-a8f2-4ffc-bda6-f19da80778ac-cert\") pod \"ingress-canary-gdx4t\" (UID: \"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-certs\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/661df377-ed57-4f75-9be9-3fc5f87cf37e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201652 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2jpd\" (UniqueName: \"kubernetes.io/projected/b4698fe3-a607-4978-bad4-5b83d3beb21b-kube-api-access-k2jpd\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bphg\" (UniqueName: \"kubernetes.io/projected/489757b2-0de4-4275-8931-daa5c3b4a75a-kube-api-access-2bphg\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201973 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd81e-16ce-4b5c-8667-85c9426a9221-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gpx9r\" (UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r"
Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201997 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsz6b\" (UniqueName: \"kubernetes.io/projected/ae2bd81e-16ce-4b5c-8667-85c9426a9221-kube-api-access-gsz6b\") pod \"multus-admission-controller-857f4d67dd-gpx9r\" (UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r"
Mar 20 17:21:14 crc
kubenswrapper[4795]: I0320 17:21:14.202021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202042 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-apiservice-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7dc57139-c6ad-4639-a09f-d07f8da49f4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202075 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7c4k\" (UniqueName: \"kubernetes.io/projected/067aa008-8dda-4bfe-bfd2-388abdb54299-kube-api-access-h7c4k\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fj9p\" (UniqueName: \"kubernetes.io/projected/a61cce87-0b4f-4886-a347-b98aecad272a-kube-api-access-8fj9p\") pod 
\"package-server-manager-789f6589d5-qj44k\" (UID: \"a61cce87-0b4f-4886-a347-b98aecad272a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202106 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0068e5c-7377-479d-9cc5-fd1270c74b33-config\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-srv-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0068e5c-7377-479d-9cc5-fd1270c74b33-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-key\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202174 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m9cd4\" (UniqueName: \"kubernetes.io/projected/fe2960a0-9218-4d46-8c50-7285c5e27882-kube-api-access-m9cd4\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2pxd\" (UniqueName: \"kubernetes.io/projected/428476fd-a8f2-4ffc-bda6-f19da80778ac-kube-api-access-c2pxd\") pod \"ingress-canary-gdx4t\" (UID: \"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202207 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-plugins-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587836f8-b700-43d0-940e-81d7820b2a6b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg7w2\" (UniqueName: \"kubernetes.io/projected/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-kube-api-access-mg7w2\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/067aa008-8dda-4bfe-bfd2-388abdb54299-service-ca-bundle\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202287 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc57139-c6ad-4639-a09f-d07f8da49f4e-config\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2da6bf3-e17d-4adf-96dd-ea097cae192b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202318 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmgxw\" (UniqueName: \"kubernetes.io/projected/d2da6bf3-e17d-4adf-96dd-ea097cae192b-kube-api-access-tmgxw\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-csi-data-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202350 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: \"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202368 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-socket-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202398 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe2960a0-9218-4d46-8c50-7285c5e27882-proxy-tls\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202414 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: 
I0320 17:21:14.202429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/587836f8-b700-43d0-940e-81d7820b2a6b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-config-volume\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-srv-cert\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202475 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-cabundle\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-images\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fwn5\" (UniqueName: \"kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5\") pod \"auto-csr-approver-29567120-j7789\" (UID: \"bed1d31b-b060-45c3-95bf-3b226a36efe1\") " pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202528 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbs2f\" (UniqueName: \"kubernetes.io/projected/5611db8a-18df-426e-a6e7-7f6720da4109-kube-api-access-nbs2f\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-stats-auth\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587836f8-b700-43d0-940e-81d7820b2a6b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202587 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-metrics-certs\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202618 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jmdf\" (UniqueName: \"kubernetes.io/projected/661df377-ed57-4f75-9be9-3fc5f87cf37e-kube-api-access-4jmdf\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvqj4\" (UniqueName: \"kubernetes.io/projected/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-kube-api-access-wvqj4\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-serving-cert\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 
17:21:14.202664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0068e5c-7377-479d-9cc5-fd1270c74b33-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-webhook-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202710 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzk8h\" (UniqueName: \"kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xln6k\" (UniqueName: \"kubernetes.io/projected/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-kube-api-access-xln6k\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.203622 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-registration-dir\") pod \"csi-hostpathplugin-454wp\" (UID: 
\"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.203844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/067aa008-8dda-4bfe-bfd2-388abdb54299-service-ca-bundle\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.204332 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-mountpoint-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.205394 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-profile-collector-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.205904 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.206045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.206578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc57139-c6ad-4639-a09f-d07f8da49f4e-config\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.207255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2da6bf3-e17d-4adf-96dd-ea097cae192b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.207399 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-csi-data-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.207948 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc57139-c6ad-4639-a09f-d07f8da49f4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.208379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-default-certificate\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.208814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-socket-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.209408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0068e5c-7377-479d-9cc5-fd1270c74b33-config\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.209648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-plugins-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.210141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/661df377-ed57-4f75-9be9-3fc5f87cf37e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.210405 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:14.710391344 +0000 UTC m=+218.168422885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.210586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: \"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.211805 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2da6bf3-e17d-4adf-96dd-ea097cae192b-proxy-tls\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.211855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a61cce87-0b4f-4886-a347-b98aecad272a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qj44k\" (UID: \"a61cce87-0b4f-4886-a347-b98aecad272a\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.212326 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-config\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.212470 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587836f8-b700-43d0-940e-81d7820b2a6b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.212737 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-srv-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.213121 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-metrics-tls\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.213304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-stats-auth\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") 
" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214026 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-certs\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-node-bootstrap-token\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214349 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/661df377-ed57-4f75-9be9-3fc5f87cf37e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd81e-16ce-4b5c-8667-85c9426a9221-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-gpx9r\" (UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5611db8a-18df-426e-a6e7-7f6720da4109-tmpfs\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.215126 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-images\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.215651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-cabundle\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.216126 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-config-volume\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.216211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe2960a0-9218-4d46-8c50-7285c5e27882-proxy-tls\") pod 
\"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.216264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-webhook-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.216667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0068e5c-7377-479d-9cc5-fd1270c74b33-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.217279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-metrics-certs\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.217628 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428476fd-a8f2-4ffc-bda6-f19da80778ac-cert\") pod \"ingress-canary-gdx4t\" (UID: \"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.217816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-serving-cert\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.221109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-key\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.221752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.222142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-apiservice-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.224212 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dbe21c7_d209_4259_b51d_b486b741e9c7.slice/crio-66e52cd7208c95fcbd71fb1927add0955601cd0ef07bbfdc1ed028ec10aff186 WatchSource:0}: Error finding container 66e52cd7208c95fcbd71fb1927add0955601cd0ef07bbfdc1ed028ec10aff186: Status 404 returned error can't find the container with id 
66e52cd7208c95fcbd71fb1927add0955601cd0ef07bbfdc1ed028ec10aff186 Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.224292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587836f8-b700-43d0-940e-81d7820b2a6b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.224621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-srv-cert\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.224889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.225726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.228651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.231111 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.234162 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.246185 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.262328 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmx2\" (UniqueName: \"kubernetes.io/projected/5c603995-8326-4bea-892a-74ee1e8c8dea-kube-api-access-bzmx2\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.279848 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89cb82f_a141_419f_bf33_93c219c84e51.slice/crio-b46aca90a25373a57866a7839140d6fd7b785d259391a93337547651dcda0236 WatchSource:0}: Error finding container b46aca90a25373a57866a7839140d6fd7b785d259391a93337547651dcda0236: Status 404 returned error can't find the container with id b46aca90a25373a57866a7839140d6fd7b785d259391a93337547651dcda0236 Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.280348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr9b7\" (UniqueName: 
\"kubernetes.io/projected/109e018a-9ad5-40e6-bd49-07d49d718161-kube-api-access-cr9b7\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.281075 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.288153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.289976 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.295556 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mhf6\" (UniqueName: \"kubernetes.io/projected/7074cf98-12f4-4a73-ad96-4959f64398a7-kube-api-access-5mhf6\") pod \"downloads-7954f5f757-xzx7n\" (UID: \"7074cf98-12f4-4a73-ad96-4959f64398a7\") " pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.303922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.304354 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:14.804341096 +0000 UTC m=+218.262372627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.310193 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.319325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2669j\" (UniqueName: \"kubernetes.io/projected/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-kube-api-access-2669j\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.340991 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpnfm\" (UniqueName: \"kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.366058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-427xq\" (UniqueName: \"kubernetes.io/projected/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-kube-api-access-427xq\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: 
\"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.383552 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.389650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" event={"ID":"bfa1fcc6-c9f8-4928-8a95-6c418323dd69","Type":"ContainerStarted","Data":"86af0d43185f7143b9a376dbde8fe9646e5d2b9422bd0f4998a07d587413a525"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.400982 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" event={"ID":"0415738e-f327-433a-9a28-0a991138e021","Type":"ContainerStarted","Data":"f1f5ee537c22d825625f686e76d46e48805d3ce0195d22ee7f088e9d51e0315f"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.405929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.406109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlnwq\" (UniqueName: \"kubernetes.io/projected/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-kube-api-access-qlnwq\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: 
\"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.406496 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:14.906475582 +0000 UTC m=+218.364507123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.420849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" event={"ID":"9f31b9ac-9447-4b20-ac60-7532edfa4600","Type":"ContainerStarted","Data":"bb67d7239ac81c201272fb1b9adeb1ecbf97d522b376528a2f9bfe7e33a8004b"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.422998 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xln6k\" (UniqueName: \"kubernetes.io/projected/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-kube-api-access-xln6k\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.428186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hn4r8" 
event={"ID":"662f8843-e25d-48ce-989d-9ea05937757d","Type":"ContainerStarted","Data":"c7f65d1274bb19079f9f79351a782d7495541a1ecdc8d88a866af54812721807"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.436875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" event={"ID":"5c21571e-5513-46e0-9eed-4ec64df8e445","Type":"ContainerStarted","Data":"df19658d353c1fec007be00d0c32bf3d267bac688d1e2b796e081a83b42f3010"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.441828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cdbk\" (UniqueName: \"kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.441834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" event={"ID":"ed2729b3-6b5a-4ae7-bad5-699c95dab85f","Type":"ContainerStarted","Data":"8dc54184aa3235d402105c8c10b4efe9d346fbed9538a68e181e2046d4944844"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.442284 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.443593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" event={"ID":"c89cb82f-a141-419f-bf33-93c219c84e51","Type":"ContainerStarted","Data":"b46aca90a25373a57866a7839140d6fd7b785d259391a93337547651dcda0236"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.443908 4795 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nww6d container/controller-manager namespace/openshift-controller-manager: Readiness 
probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.443932 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.444657 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" event={"ID":"e6f292b8-878f-418e-8c85-2f7818e9dba1","Type":"ContainerStarted","Data":"506917fa822eba8eeccd36c6e28b51af1e4e1e68c026c93680ba389b83e3ae85"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.482704 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbc5q\" (UniqueName: \"kubernetes.io/projected/ff31d5af-4eae-43e7-8512-c6f5d54501e1-kube-api-access-lbc5q\") pod \"migrator-59844c95c7-q9zkh\" (UID: \"ff31d5af-4eae-43e7-8512-c6f5d54501e1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.482785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7dc57139-c6ad-4639-a09f-d07f8da49f4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.483165 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.483762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.484994 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" event={"ID":"2dbe21c7-d209-4259-b51d-b486b741e9c7","Type":"ContainerStarted","Data":"66e52cd7208c95fcbd71fb1927add0955601cd0ef07bbfdc1ed028ec10aff186"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.504887 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bphg\" (UniqueName: \"kubernetes.io/projected/489757b2-0de4-4275-8931-daa5c3b4a75a-kube-api-access-2bphg\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.506738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.507528 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.007232568 +0000 UTC m=+218.465264099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.507911 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.508360 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.008347329 +0000 UTC m=+218.466378870 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.510114 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.519774 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.520423 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.528959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmgxw\" (UniqueName: \"kubernetes.io/projected/d2da6bf3-e17d-4adf-96dd-ea097cae192b-kube-api-access-tmgxw\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.540395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsz6b\" (UniqueName: \"kubernetes.io/projected/ae2bd81e-16ce-4b5c-8667-85c9426a9221-kube-api-access-gsz6b\") pod \"multus-admission-controller-857f4d67dd-gpx9r\" (UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.572784 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7c4k\" (UniqueName: \"kubernetes.io/projected/067aa008-8dda-4bfe-bfd2-388abdb54299-kube-api-access-h7c4k\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.591265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fj9p\" (UniqueName: 
\"kubernetes.io/projected/a61cce87-0b4f-4886-a347-b98aecad272a-kube-api-access-8fj9p\") pod \"package-server-manager-789f6589d5-qj44k\" (UID: \"a61cce87-0b4f-4886-a347-b98aecad272a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.598277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfz4k\" (UniqueName: \"kubernetes.io/projected/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-kube-api-access-jfz4k\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.609375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.612485 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.112457397 +0000 UTC m=+218.570489038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.614484 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mmtf7"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.622959 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-45pjp"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.623637 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.627489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbs2f\" (UniqueName: \"kubernetes.io/projected/5611db8a-18df-426e-a6e7-7f6720da4109-kube-api-access-nbs2f\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.638298 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.648033 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.659282 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9cd4\" (UniqueName: \"kubernetes.io/projected/fe2960a0-9218-4d46-8c50-7285c5e27882-kube-api-access-m9cd4\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.660446 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.662419 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2pxd\" (UniqueName: \"kubernetes.io/projected/428476fd-a8f2-4ffc-bda6-f19da80778ac-kube-api-access-c2pxd\") pod \"ingress-canary-gdx4t\" (UID: \"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.663447 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod109e018a_9ad5_40e6_bd49_07d49d718161.slice/crio-b708b1c701818d51fc3cb8fe9602af7235c07027a0302e2046f6aae6243a8521 WatchSource:0}: Error finding container b708b1c701818d51fc3cb8fe9602af7235c07027a0302e2046f6aae6243a8521: Status 404 returned error can't find the container with id b708b1c701818d51fc3cb8fe9602af7235c07027a0302e2046f6aae6243a8521 Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.687893 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.693928 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.701521 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg7w2\" (UniqueName: \"kubernetes.io/projected/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-kube-api-access-mg7w2\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.701625 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2jpd\" (UniqueName: \"kubernetes.io/projected/b4698fe3-a607-4978-bad4-5b83d3beb21b-kube-api-access-k2jpd\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.706253 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.710392 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cdrcc"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.711479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.711897 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.211882215 +0000 UTC m=+218.669913756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.719660 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.728067 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.735263 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqj4\" (UniqueName: \"kubernetes.io/projected/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-kube-api-access-wvqj4\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.740107 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.740334 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e80a44_9bdc_4321_9536_8eba4527f181.slice/crio-e896b711c3658be489a0bb06d1a95b5034bb3fc548185ed627dd3dd0684f441b WatchSource:0}: Error finding container e896b711c3658be489a0bb06d1a95b5034bb3fc548185ed627dd3dd0684f441b: Status 404 returned error can't find the container with id e896b711c3658be489a0bb06d1a95b5034bb3fc548185ed627dd3dd0684f441b Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.745661 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0068e5c-7377-479d-9cc5-fd1270c74b33-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.749079 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.753677 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.763667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtbkf\" (UniqueName: \"kubernetes.io/projected/7479d10c-1c3b-497e-8dda-07cd22aeccf0-kube-api-access-qtbkf\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.777332 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jmdf\" (UniqueName: \"kubernetes.io/projected/661df377-ed57-4f75-9be9-3fc5f87cf37e-kube-api-access-4jmdf\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.778829 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.797341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.802143 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.809762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/587836f8-b700-43d0-940e-81d7820b2a6b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.812223 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.812648 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.312560829 +0000 UTC m=+218.770592380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.815789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.816284 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.316269272 +0000 UTC m=+218.774300813 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.821005 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.825721 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.830037 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwn5\" (UniqueName: \"kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5\") pod \"auto-csr-approver-29567120-j7789\" (UID: \"bed1d31b-b060-45c3-95bf-3b226a36efe1\") " pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.840376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzk8h\" (UniqueName: \"kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.878408 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.880449 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xzx7n"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.917557 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.917720 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.417696143 +0000 UTC m=+218.875727684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.917841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.918138 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.418126279 +0000 UTC m=+218.876157810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.931803 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.948260 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7074cf98_12f4_4a73_ad96_4959f64398a7.slice/crio-5815c7abe698eb110c49411c695f47ad0e46e3c7cdd3062d360030c37119db73 WatchSource:0}: Error finding container 5815c7abe698eb110c49411c695f47ad0e46e3c7cdd3062d360030c37119db73: Status 404 returned error can't find the container with id 5815c7abe698eb110c49411c695f47ad0e46e3c7cdd3062d360030c37119db73 Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.953225 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.958143 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c2ae844_da0b_45fd_b9f6_e780f4c5e3c0.slice/crio-370dce15e6c84cc0a92e286c6f5fbcc24fb25aca471b6b19fac2c69c5d27bb55 WatchSource:0}: Error finding container 370dce15e6c84cc0a92e286c6f5fbcc24fb25aca471b6b19fac2c69c5d27bb55: Status 404 returned error can't find the container with id 370dce15e6c84cc0a92e286c6f5fbcc24fb25aca471b6b19fac2c69c5d27bb55 Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.010827 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.019836 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.020331 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.520298856 +0000 UTC m=+218.978330397 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.038951 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.055478 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.063850 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.073894 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.095137 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.121876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.122157 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.622146512 +0000 UTC m=+219.080178053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.138768 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.161629 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.166981 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-97wlq"] Mar 20 17:21:15 crc 
kubenswrapper[4795]: I0320 17:21:15.222767 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.223145 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.723129457 +0000 UTC m=+219.181160998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.325442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.325849 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:15.825833434 +0000 UTC m=+219.283864975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.335877 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gpx9r"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.368962 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.426349 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.426607 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.92658372 +0000 UTC m=+219.384615261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.517378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" event={"ID":"74d8b767-93df-4c96-a7f0-e7e84ba99380","Type":"ContainerStarted","Data":"54c16e287e6b044067d81a5f122f5fce8bd8b850064a731beff318d152b5a0e9"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.532015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.534999 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.034983882 +0000 UTC m=+219.493015423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.556300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" event={"ID":"109e018a-9ad5-40e6-bd49-07d49d718161","Type":"ContainerStarted","Data":"b708b1c701818d51fc3cb8fe9602af7235c07027a0302e2046f6aae6243a8521"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.559313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" event={"ID":"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc","Type":"ContainerStarted","Data":"18af66a2ad92bf329fba2a2f1838a1cccadcb24d62e75458f9f37c1f2a373e33"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.581490 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" event={"ID":"c89cb82f-a141-419f-bf33-93c219c84e51","Type":"ContainerStarted","Data":"180c6dc025013638cc807fdf1c845f8d6cda91b8b6f98a2afe0514c07d4b9530"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.581534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" event={"ID":"c89cb82f-a141-419f-bf33-93c219c84e51","Type":"ContainerStarted","Data":"8579508d0e7b50d30b4b5e7989c16cb3aeefdcdf0d60a69b20c5745b4c3dc9da"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.583886 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" event={"ID":"9b31867d-2f52-4f4c-943a-9431cb585027","Type":"ContainerStarted","Data":"f1055e0f897eb3de575665d60219e71c19392e8e5d91a217f1cf463d9d4c9c60"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.586030 4795 generic.go:334] "Generic (PLEG): container finished" podID="5c21571e-5513-46e0-9eed-4ec64df8e445" containerID="c9ed998f7b281a48e48683b77903f0a206a9d4a28e8a2a53e079b69fdbc3b983" exitCode=0 Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.586092 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" event={"ID":"5c21571e-5513-46e0-9eed-4ec64df8e445","Type":"ContainerDied","Data":"c9ed998f7b281a48e48683b77903f0a206a9d4a28e8a2a53e079b69fdbc3b983"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.587270 4795 generic.go:334] "Generic (PLEG): container finished" podID="0415738e-f327-433a-9a28-0a991138e021" containerID="f68eb51d5c9397851fe5e4cd3a21388a07f0fc3771864a526fae4e81b30aca40" exitCode=0 Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.587323 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" event={"ID":"0415738e-f327-433a-9a28-0a991138e021","Type":"ContainerDied","Data":"f68eb51d5c9397851fe5e4cd3a21388a07f0fc3771864a526fae4e81b30aca40"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.588180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" event={"ID":"e6f292b8-878f-418e-8c85-2f7818e9dba1","Type":"ContainerStarted","Data":"7e929028c9976aa944a24cd775ab21d67ac03300461edb7f6203898cf7a1e075"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.592975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" 
event={"ID":"ed2729b3-6b5a-4ae7-bad5-699c95dab85f","Type":"ContainerStarted","Data":"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.593717 4795 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nww6d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.593761 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.593974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" event={"ID":"e5e80a44-9bdc-4321-9536-8eba4527f181","Type":"ContainerStarted","Data":"e896b711c3658be489a0bb06d1a95b5034bb3fc548185ed627dd3dd0684f441b"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.595479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lrxrs" event={"ID":"067aa008-8dda-4bfe-bfd2-388abdb54299","Type":"ContainerStarted","Data":"4e4517e84f9b723ec8f24685f10c6bbee574b560c64eb3923fca706613086039"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.598252 4795 generic.go:334] "Generic (PLEG): container finished" podID="2dbe21c7-d209-4259-b51d-b486b741e9c7" containerID="1733cd22fbd27185865760c89a7ab5cc50cefba540a0433cda64c9f1148b7ed5" exitCode=0 Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.598320 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" event={"ID":"2dbe21c7-d209-4259-b51d-b486b741e9c7","Type":"ContainerDied","Data":"1733cd22fbd27185865760c89a7ab5cc50cefba540a0433cda64c9f1148b7ed5"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.601839 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" event={"ID":"9f31b9ac-9447-4b20-ac60-7532edfa4600","Type":"ContainerStarted","Data":"2048fcd6da728d0a278446bafbcdb77ecde924f134d2b1a44b72a125b1721e3c"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.601890 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" event={"ID":"9f31b9ac-9447-4b20-ac60-7532edfa4600","Type":"ContainerStarted","Data":"99d36198ddf70b000627cdbdb035489a54a189c71817814eeed4f8b69dc14386"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.608655 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" event={"ID":"bfa1fcc6-c9f8-4928-8a95-6c418323dd69","Type":"ContainerStarted","Data":"fa028b0d15db3d54fb873ef78829dfec8396a21c64139d721d4aa98c65b9c0a6"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.608711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" event={"ID":"bfa1fcc6-c9f8-4928-8a95-6c418323dd69","Type":"ContainerStarted","Data":"91f71ce7f513a7937de856fd70108bad56de6a942f3aad8f88712b5ae1b6e03d"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.617523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hn4r8" event={"ID":"662f8843-e25d-48ce-989d-9ea05937757d","Type":"ContainerStarted","Data":"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.618560 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xzx7n" event={"ID":"7074cf98-12f4-4a73-ad96-4959f64398a7","Type":"ContainerStarted","Data":"5815c7abe698eb110c49411c695f47ad0e46e3c7cdd3062d360030c37119db73"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.625628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" event={"ID":"0589e639-75bf-4a26-a80b-dbb69a6c9955","Type":"ContainerStarted","Data":"76a82a643089fa9d447aac978b5c6e67418fb24ac358cc2b5b2b550cdda222c3"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.627392 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-97wlq" event={"ID":"5c603995-8326-4bea-892a-74ee1e8c8dea","Type":"ContainerStarted","Data":"6c46a4db9827b197f840f3213bab1ae85a40953d78cbd3db3e409d8746c0b8fa"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.628041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" event={"ID":"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db","Type":"ContainerStarted","Data":"e223dc095c0838fe7665fd014cc949fa9ca5d604adfd1ffc0cadf593e6b69e7f"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.628555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" event={"ID":"32dc8fa2-0199-444e-9983-4af0fb9172b1","Type":"ContainerStarted","Data":"f781d3f94d993cd68303fcd18451ac14ed765b0b116e4ee7ddb6556d28d030fa"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.629054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8v58t" event={"ID":"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0","Type":"ContainerStarted","Data":"370dce15e6c84cc0a92e286c6f5fbcc24fb25aca471b6b19fac2c69c5d27bb55"} Mar 20 17:21:15 crc kubenswrapper[4795]: 
I0320 17:21:15.637228 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.637995 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.137974758 +0000 UTC m=+219.596006289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.744048 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.754591 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:16.254562516 +0000 UTC m=+219.712594057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.846631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.847279 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.347238211 +0000 UTC m=+219.805269762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.847938 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.848381 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.348369402 +0000 UTC m=+219.806400943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.903873 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.923685 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.952269 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.952802 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.45277047 +0000 UTC m=+219.910802011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.054368 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.054765 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.554752471 +0000 UTC m=+220.012784012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: W0320 17:21:16.090895 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd9b8a97_1b9d_4365_a985_a02d4078e3c2.slice/crio-e5b30a980a1ea8f000e0c5e966caf03a49d033f0c2a6d4a575291d2d8a645681 WatchSource:0}: Error finding container e5b30a980a1ea8f000e0c5e966caf03a49d033f0c2a6d4a575291d2d8a645681: Status 404 returned error can't find the container with id e5b30a980a1ea8f000e0c5e966caf03a49d033f0c2a6d4a575291d2d8a645681 Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.154860 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.155363 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.655349832 +0000 UTC m=+220.113381373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.224256 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" podStartSLOduration=163.224234541 podStartE2EDuration="2m43.224234541s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.216352077 +0000 UTC m=+219.674383618" watchObservedRunningTime="2026-03-20 17:21:16.224234541 +0000 UTC m=+219.682266082" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.256779 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.257136 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.757116614 +0000 UTC m=+220.215148155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.264194 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gdx4t"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.286570 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.294647 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.314049 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-454wp"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.316772 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5v822"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.358211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.358582 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.858566456 +0000 UTC m=+220.316597997 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.407773 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.437316 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.483985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.484340 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.984315382 +0000 UTC m=+220.442346923 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.528304 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.529661 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.538140 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.538525 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" podStartSLOduration=163.538509733 podStartE2EDuration="2m43.538509733s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.471770191 +0000 UTC m=+219.929801742" watchObservedRunningTime="2026-03-20 17:21:16.538509733 +0000 UTC m=+219.996541274" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.543812 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" podStartSLOduration=163.543794123 podStartE2EDuration="2m43.543794123s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.526815012 +0000 UTC m=+219.984846553" watchObservedRunningTime="2026-03-20 17:21:16.543794123 +0000 UTC m=+220.001825664" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.556863 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.559328 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c49vv"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.586510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.586815 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.08680075 +0000 UTC m=+220.544832291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.594422 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.597755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-j7789"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.601350 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kgsw2"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.606638 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.632614 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.656075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lrxrs" event={"ID":"067aa008-8dda-4bfe-bfd2-388abdb54299","Type":"ContainerStarted","Data":"5074947363c8e078a6e0c94b853247cc510b7024fc9ef716b4a077bf949eef30"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.679039 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hn4r8" podStartSLOduration=163.679025601 podStartE2EDuration="2m43.679025601s" 
podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.678355677 +0000 UTC m=+220.136387228" watchObservedRunningTime="2026-03-20 17:21:16.679025601 +0000 UTC m=+220.137057142" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.693849 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.694168 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.194157915 +0000 UTC m=+220.652189456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.713379 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" podStartSLOduration=163.713365666 podStartE2EDuration="2m43.713365666s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.712258587 +0000 UTC m=+220.170290138" watchObservedRunningTime="2026-03-20 17:21:16.713365666 +0000 UTC m=+220.171397207" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.743904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gdx4t" event={"ID":"428476fd-a8f2-4ffc-bda6-f19da80778ac","Type":"ContainerStarted","Data":"a2ec43531ee102b81c16e77f9e33cd1c8bfc557b2e21e312edb7e2216cbd8413"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.774966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" event={"ID":"d2da6bf3-e17d-4adf-96dd-ea097cae192b","Type":"ContainerStarted","Data":"85d0bfe7d4eb237c6bde932b0f7cbca57c7cee54530fd3158bc9526fe9da9bfd"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.775127 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.795759 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.797328 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.297311868 +0000 UTC m=+220.755343409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.802216 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" podStartSLOduration=163.802196874 podStartE2EDuration="2m43.802196874s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.797189363 +0000 UTC m=+220.255220904" watchObservedRunningTime="2026-03-20 17:21:16.802196874 +0000 UTC m=+220.260228415" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.830577 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" 
event={"ID":"0415738e-f327-433a-9a28-0a991138e021","Type":"ContainerStarted","Data":"b4ae167299192c5b7ea1edc08b3d4054f88f12c7bc58610329352460a168f0c8"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.849869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" event={"ID":"b4698fe3-a607-4978-bad4-5b83d3beb21b","Type":"ContainerStarted","Data":"844da625d6cd3258eb757238232b8cab634b2d2b87eb5b6a81bc94c3ddac17f8"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.873340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5v822" event={"ID":"ee6682f1-2148-45a9-ac41-aeb6fddbabb4","Type":"ContainerStarted","Data":"31266e8030183d9f823c01a5d24f31a879c15602d69fc31d02394e66ea006052"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.899396 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.901463 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.401447716 +0000 UTC m=+220.859479257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.910803 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" event={"ID":"74d8b767-93df-4c96-a7f0-e7e84ba99380","Type":"ContainerStarted","Data":"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.913537 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.912966 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-lrxrs" podStartSLOduration=163.91295042 podStartE2EDuration="2m43.91295042s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.910888076 +0000 UTC m=+220.368919617" watchObservedRunningTime="2026-03-20 17:21:16.91295042 +0000 UTC m=+220.370981961" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.916801 4795 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mmtf7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" start-of-body= Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.916856 4795 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.940836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" event={"ID":"918aa57e-8c94-4427-b6bd-218a5687d684","Type":"ContainerStarted","Data":"552c19deec1ee9883f89b895bd5a9ae748bbcb5e7537b45d9e966f6c5f189edb"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.944545 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" podStartSLOduration=163.944526097 podStartE2EDuration="2m43.944526097s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.937981531 +0000 UTC m=+220.396013072" watchObservedRunningTime="2026-03-20 17:21:16.944526097 +0000 UTC m=+220.402557638" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.975151 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" event={"ID":"5c21571e-5513-46e0-9eed-4ec64df8e445","Type":"ContainerStarted","Data":"28f31a930970e41b0cef9f1365f8642907101991a92f9c69e9321574b4fa02fb"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.985236 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"] Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.000967 4795 ???:1] "http: TLS handshake error from 192.168.126.11:45976: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.035389 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.037659 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" podStartSLOduration=164.037644128 podStartE2EDuration="2m44.037644128s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.991232378 +0000 UTC m=+220.449263909" watchObservedRunningTime="2026-03-20 17:21:17.037644128 +0000 UTC m=+220.495675669" Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.043994 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.543970706 +0000 UTC m=+221.002002237 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.048366 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" event={"ID":"5611db8a-18df-426e-a6e7-7f6720da4109","Type":"ContainerStarted","Data":"d1be7ae7a18f045a6ad585588e79f66d1659499c01a7a3968ddeb8104aa71b6c"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.048443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" event={"ID":"5611db8a-18df-426e-a6e7-7f6720da4109","Type":"ContainerStarted","Data":"3c6528db53dd00c4095acc6b155873dcd4c99c1f85eab4c0a64d5c27b2f82a85"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.049885 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.066545 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.066939 4795 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-c2p9f container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.068420 4795 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" podUID="5611db8a-18df-426e-a6e7-7f6720da4109" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.077223 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" event={"ID":"ae2bd81e-16ce-4b5c-8667-85c9426a9221","Type":"ContainerStarted","Data":"656623000fd2b6c8f5c9b771fe74413e198fc2c20adee4fb3f7a29a06a8553ad"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.077291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" event={"ID":"ae2bd81e-16ce-4b5c-8667-85c9426a9221","Type":"ContainerStarted","Data":"0bcf03237f8b723d2e143b3ed0a5ebe114f436dae5359163a75c364d0874c5ca"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.093666 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" podStartSLOduration=164.088873532 podStartE2EDuration="2m44.088873532s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.087353798 +0000 UTC m=+220.545385339" watchObservedRunningTime="2026-03-20 17:21:17.088873532 +0000 UTC m=+220.546905073" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.116082 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" event={"ID":"a61cce87-0b4f-4886-a347-b98aecad272a","Type":"ContainerStarted","Data":"4aeaab9db3e68a9fcefb7e827f62ac75ed7dc12dd4c14a5a1d3770bea71ec19f"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.128017 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" event={"ID":"ff31d5af-4eae-43e7-8512-c6f5d54501e1","Type":"ContainerStarted","Data":"cd8eb396cff6031b3c84ff18d9f8b15e72fe7f8c789adb6b655a9f2e39111bce"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.137595 4795 ???:1] "http: TLS handshake error from 192.168.126.11:45990: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.137716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.138117 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerStarted","Data":"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.138265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerStarted","Data":"10ac9aefe8ac1466c7fac8993e74ddbafb9c6821332b48f3d05657ff9290f6e5"} Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.138391 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.638376124 +0000 UTC m=+221.096407735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.139068 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.140607 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-clvzs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.140746 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.143784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-97wlq" event={"ID":"5c603995-8326-4bea-892a-74ee1e8c8dea","Type":"ContainerStarted","Data":"8ca2b6dc3b7309e97e3343c108c271295289b09bf2fdad4cda5a15905c7d3f5c"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.143833 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 
17:21:17.147972 4795 patch_prober.go:28] interesting pod/console-operator-58897d9998-97wlq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.148029 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-97wlq" podUID="5c603995-8326-4bea-892a-74ee1e8c8dea" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.149216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-454wp" event={"ID":"489757b2-0de4-4275-8931-daa5c3b4a75a","Type":"ContainerStarted","Data":"43dabd2f36a64796afc5bb69ac01cf23091d9567a626a9d8c7bcbcbd63e40a77"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.156546 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" event={"ID":"e5e80a44-9bdc-4321-9536-8eba4527f181","Type":"ContainerStarted","Data":"4f4163c43ad7e99e4fc450a246670ca3e89f4a7cd4e150b2fb7c320ffc818ac6"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.158933 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" event={"ID":"32dc8fa2-0199-444e-9983-4af0fb9172b1","Type":"ContainerStarted","Data":"ed3d9634104e48b3e997a511558964c428b3525c8eb6eea89c978e4afba0014b"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.158974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" 
event={"ID":"32dc8fa2-0199-444e-9983-4af0fb9172b1","Type":"ContainerStarted","Data":"2302ec3f03a6f6e4554b4902e4e20d7ff92560402c2630351bfbe65e589c53c5"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.161941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" event={"ID":"109e018a-9ad5-40e6-bd49-07d49d718161","Type":"ContainerStarted","Data":"5eb1ff3d6226ec2921a970f4e455445453aaf29559587a532f5a5839e6bf53b6"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.161998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" event={"ID":"109e018a-9ad5-40e6-bd49-07d49d718161","Type":"ContainerStarted","Data":"69cff67a961c3b591feac8bcfe68ebb66c1041603648fbc42cf4c026019a25c9"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.168136 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" podStartSLOduration=164.168119065 podStartE2EDuration="2m44.168119065s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.167099568 +0000 UTC m=+220.625131109" watchObservedRunningTime="2026-03-20 17:21:17.168119065 +0000 UTC m=+220.626150626" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.194263 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" podStartSLOduration=164.194244926 podStartE2EDuration="2m44.194244926s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.193310492 +0000 UTC m=+220.651342033" watchObservedRunningTime="2026-03-20 17:21:17.194244926 +0000 UTC 
m=+220.652276467" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.195701 4795 ???:1] "http: TLS handshake error from 192.168.126.11:45998: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.222138 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" event={"ID":"9b31867d-2f52-4f4c-943a-9431cb585027","Type":"ContainerStarted","Data":"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.222871 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.231233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xzx7n" event={"ID":"7074cf98-12f4-4a73-ad96-4959f64398a7","Type":"ContainerStarted","Data":"ac28fbf851090a3f20657a132f33c64066426cdda1281462f4b02b532539f53a"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.231784 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.238045 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-xzx7n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.238089 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xzx7n" podUID="7074cf98-12f4-4a73-ad96-4959f64398a7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 17:21:17 crc 
kubenswrapper[4795]: I0320 17:21:17.238602 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.239458 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.739444762 +0000 UTC m=+221.197476293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.252574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" event={"ID":"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb","Type":"ContainerStarted","Data":"078e38f1e68993a94da4fa6731635b71bf717db58878df9dddaaff4f76a78a28"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.263719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" event={"ID":"fe2960a0-9218-4d46-8c50-7285c5e27882","Type":"ContainerStarted","Data":"0b9b62e8bcb48fda415edd92c26eb068bd4c60e58a12ac8eaf98d8be4020ad2a"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.272602 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" event={"ID":"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db","Type":"ContainerStarted","Data":"2379257154e0054e98ee3a70fccefb094c64b7450dc6992a4ce9411ab4b33bae"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.277647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" event={"ID":"cd9b8a97-1b9d-4365-a985-a02d4078e3c2","Type":"ContainerStarted","Data":"88bf7f225a65d9066735c363a6af63172e793eb14b750c94f9cbb6bebc8f4cf6"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.277720 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" event={"ID":"cd9b8a97-1b9d-4365-a985-a02d4078e3c2","Type":"ContainerStarted","Data":"e5b30a980a1ea8f000e0c5e966caf03a49d033f0c2a6d4a575291d2d8a645681"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.293154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" event={"ID":"2dbe21c7-d209-4259-b51d-b486b741e9c7","Type":"ContainerStarted","Data":"736fcacac3c55fdf35c00fdcfd8aef8acd88fcbbf9c2da52162e023dbf09f8b5"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.293312 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.302094 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46004: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.304675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" 
event={"ID":"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc","Type":"ContainerStarted","Data":"a9db3ae2c35b309dccb2b4aad094b93af423521c890af0d8da9bafb6fc6d1a48"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.312588 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" podStartSLOduration=164.312574944 podStartE2EDuration="2m44.312574944s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.253006921 +0000 UTC m=+220.711038452" watchObservedRunningTime="2026-03-20 17:21:17.312574944 +0000 UTC m=+220.770606485" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.313759 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" podStartSLOduration=164.313752937 podStartE2EDuration="2m44.313752937s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.312102927 +0000 UTC m=+220.770134468" watchObservedRunningTime="2026-03-20 17:21:17.313752937 +0000 UTC m=+220.771784478" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.329160 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" event={"ID":"0589e639-75bf-4a26-a80b-dbb69a6c9955","Type":"ContainerStarted","Data":"16bdca6831351fe8fda25820cdee23586fb9c32f72a6772ec3dae34fdfc35e77"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.339914 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-97wlq" podStartSLOduration=164.339897818 podStartE2EDuration="2m44.339897818s" podCreationTimestamp="2026-03-20 17:18:33 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.339565136 +0000 UTC m=+220.797596677" watchObservedRunningTime="2026-03-20 17:21:17.339897818 +0000 UTC m=+220.797929369" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.340363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.343095 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.843084502 +0000 UTC m=+221.301116043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.351770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8v58t" event={"ID":"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0","Type":"ContainerStarted","Data":"3f7fe5bf0227e70a4d4d0a8926c977f8e860e099cd32663a603e50c22d058872"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.363611 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" event={"ID":"661df377-ed57-4f75-9be9-3fc5f87cf37e","Type":"ContainerStarted","Data":"137f21a34253c69141125a51a6740fb94fed150263dd468485ae82a0c92d589e"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.368546 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" event={"ID":"7dc57139-c6ad-4639-a09f-d07f8da49f4e","Type":"ContainerStarted","Data":"e5354570ff761cd4a55de7c9761b88e79e43092472e6aa3a0f82a5da49a6dcb2"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.381451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.412076 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46008: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.412882 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" podStartSLOduration=164.412864574 podStartE2EDuration="2m44.412864574s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.412261283 +0000 UTC m=+220.870292824" watchObservedRunningTime="2026-03-20 17:21:17.412864574 +0000 UTC m=+220.870896115" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.414821 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" podStartSLOduration=164.414814705 podStartE2EDuration="2m44.414814705s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.383608161 +0000 UTC m=+220.841639702" watchObservedRunningTime="2026-03-20 17:21:17.414814705 +0000 UTC m=+220.872846246" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.445133 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.446293 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.946274396 +0000 UTC m=+221.404305937 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.450466 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" podStartSLOduration=164.450452917 podStartE2EDuration="2m44.450452917s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.444541954 +0000 UTC m=+220.902573495" watchObservedRunningTime="2026-03-20 17:21:17.450452917 +0000 UTC m=+220.908484458" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.474958 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" podStartSLOduration=164.474944139 podStartE2EDuration="2m44.474944139s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.474594326 +0000 UTC m=+220.932625867" watchObservedRunningTime="2026-03-20 17:21:17.474944139 +0000 UTC m=+220.932975680" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.516734 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46012: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.526939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/downloads-7954f5f757-xzx7n" podStartSLOduration=164.52692658 podStartE2EDuration="2m44.52692658s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.499205082 +0000 UTC m=+220.957236623" watchObservedRunningTime="2026-03-20 17:21:17.52692658 +0000 UTC m=+220.984958121" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.547245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.549267 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.049247524 +0000 UTC m=+221.507279065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.569957 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" podStartSLOduration=164.569940448 podStartE2EDuration="2m44.569940448s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.541923819 +0000 UTC m=+220.999955360" watchObservedRunningTime="2026-03-20 17:21:17.569940448 +0000 UTC m=+221.027971989" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.625603 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" podStartSLOduration=164.625581981 podStartE2EDuration="2m44.625581981s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.566282077 +0000 UTC m=+221.024313618" watchObservedRunningTime="2026-03-20 17:21:17.625581981 +0000 UTC m=+221.083613522" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.626940 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46022: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.627932 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.639006 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.649809 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.650819 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.150799198 +0000 UTC m=+221.608830729 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.658811 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8v58t" podStartSLOduration=6.658792077 podStartE2EDuration="6.658792077s" podCreationTimestamp="2026-03-20 17:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.627235801 +0000 UTC m=+221.085267342" watchObservedRunningTime="2026-03-20 17:21:17.658792077 +0000 UTC m=+221.116823618" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.659164 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:17 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:17 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:17 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.659207 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.755191 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.755671 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.255639172 +0000 UTC m=+221.713670713 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.795673 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46028: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.858566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.858950 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-20 17:21:18.3589377 +0000 UTC m=+221.816969241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.960470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.960873 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.460859859 +0000 UTC m=+221.918891400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.061217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.061846 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.561829864 +0000 UTC m=+222.019861405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.171360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.171744 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.671727739 +0000 UTC m=+222.129759280 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.272146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.272450 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.772435853 +0000 UTC m=+222.230467394 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.376215 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.376501 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.876487169 +0000 UTC m=+222.334518710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.396764 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5v822" event={"ID":"ee6682f1-2148-45a9-ac41-aeb6fddbabb4","Type":"ContainerStarted","Data":"8ec532c8079cd62d4c26a9733e32c1b12deb61ee64035e6e8fa18edbb30cca2f"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.474209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" event={"ID":"5c21571e-5513-46e0-9eed-4ec64df8e445","Type":"ContainerStarted","Data":"224d79e5dc4ec71d76223795bb5cbedf4513345b2722860bb4471a4dc08f027f"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.477210 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.477508 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.977490094 +0000 UTC m=+222.435521635 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.488530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" event={"ID":"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb","Type":"ContainerStarted","Data":"12b36845a5a7d228e3e748fe6a0fda2bb004ced1ad6fb38879b59fe5fb0b0c33"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.505878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" event={"ID":"fe2960a0-9218-4d46-8c50-7285c5e27882","Type":"ContainerStarted","Data":"7147945169d288d4147169b27adce639e5218dbe624dadf2e19de8e5cd1f7b88"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.512492 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gdx4t" event={"ID":"428476fd-a8f2-4ffc-bda6-f19da80778ac","Type":"ContainerStarted","Data":"bb7f33248b0e1fb6f93959335725dcf46d2ff4ab075cdb3f56b0e0fe7a1bf85a"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.521527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" event={"ID":"ff31d5af-4eae-43e7-8512-c6f5d54501e1","Type":"ContainerStarted","Data":"05423c9183d5036ef1a1c4ae8a8e678057166b475a68c1fbe1b7635f827ee892"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.528591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" 
event={"ID":"918aa57e-8c94-4427-b6bd-218a5687d684","Type":"ContainerStarted","Data":"1a29e74f6dc8f40ef08045f483f837253c35a577aa6f85ce5cd8c2a56afebf9c"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.537440 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46030: no serving certificate available for the kubelet" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.546666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" event={"ID":"b4698fe3-a607-4978-bad4-5b83d3beb21b","Type":"ContainerStarted","Data":"1e232a99c6bb86fa55b55610d3a0d5aea787c9dd57ea0519637c851d1b2fa337"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.546982 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.548045 4795 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-nsf5t container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.548081 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" podUID="b4698fe3-a607-4978-bad4-5b83d3beb21b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.569859 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" 
event={"ID":"d2da6bf3-e17d-4adf-96dd-ea097cae192b","Type":"ContainerStarted","Data":"94fe864c74af53577f10e39d8b02a6f9d851e4064fdc194056dfbdf5c473e26b"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.570575 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" podStartSLOduration=165.570532564 podStartE2EDuration="2m45.570532564s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.537510705 +0000 UTC m=+221.995542246" watchObservedRunningTime="2026-03-20 17:21:18.570532564 +0000 UTC m=+222.028564105" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.573351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" event={"ID":"d0068e5c-7377-479d-9cc5-fd1270c74b33","Type":"ContainerStarted","Data":"a9ba1ae8884b90fea3faf4a17e7dc6dc873d005658b2a2b343bca6dd728d748c"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.578570 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.579965 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.079951133 +0000 UTC m=+222.537982674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.580676 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" event={"ID":"a61cce87-0b4f-4886-a347-b98aecad272a","Type":"ContainerStarted","Data":"47ba2a6c1eab62bab6e402e6b9a48c862df366bee01594b1a8522c5fb95bbd91"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.580727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" event={"ID":"a61cce87-0b4f-4886-a347-b98aecad272a","Type":"ContainerStarted","Data":"ef1bd3cd1a6f7f35a53192afe65bc066ef3ea860125dcb3c070783cdd45e1106"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.581135 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.590405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" event={"ID":"d1a9c8a4-d7c9-4365-8516-465b89c76ea8","Type":"ContainerStarted","Data":"064ccc2cb0d574f51c8f36bda50c4b3467372412b341b19c96ca0545c2cb6b83"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.590721 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.596840 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" event={"ID":"587836f8-b700-43d0-940e-81d7820b2a6b","Type":"ContainerStarted","Data":"d9ccefe373a7c66ab0fc517c77194971cbd3f11fab9855fec9dece4ab1ffb741"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.598943 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" event={"ID":"ae2bd81e-16ce-4b5c-8667-85c9426a9221","Type":"ContainerStarted","Data":"6bfcd69fde40a5a7065de6480195d1ea02ec281fd4ccf0bf301f7a53857fc995"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.605305 4795 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-wwjgw container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.605342 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" podUID="d1a9c8a4-d7c9-4365-8516-465b89c76ea8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.608798 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" podStartSLOduration=165.60878261 podStartE2EDuration="2m45.60878261s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.608207639 +0000 UTC m=+222.066239180" watchObservedRunningTime="2026-03-20 17:21:18.60878261 +0000 UTC m=+222.066814141" Mar 20 17:21:18 
crc kubenswrapper[4795]: I0320 17:21:18.609612 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" podStartSLOduration=165.60960625 podStartE2EDuration="2m45.60960625s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.587580037 +0000 UTC m=+222.045611578" watchObservedRunningTime="2026-03-20 17:21:18.60960625 +0000 UTC m=+222.067637791" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.620529 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" event={"ID":"7479d10c-1c3b-497e-8dda-07cd22aeccf0","Type":"ContainerStarted","Data":"8b6642f8b3cb955eddfd4a9ff9e184d0f2736818200c0c1b8f9d9e4799c51d01"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.625666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567120-j7789" event={"ID":"bed1d31b-b060-45c3-95bf-3b226a36efe1","Type":"ContainerStarted","Data":"dd9d5f9731ec60032210cdc180eb41d5f236e29f6e6729daa332365615c09023"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.625826 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" containerID="cri-o://2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58" gracePeriod=30 Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.627238 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-clvzs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 20 17:21:18 crc 
kubenswrapper[4795]: I0320 17:21:18.627269 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.628361 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gdx4t" podStartSLOduration=7.628341644 podStartE2EDuration="7.628341644s" podCreationTimestamp="2026-03-20 17:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.627080829 +0000 UTC m=+222.085112370" watchObservedRunningTime="2026-03-20 17:21:18.628341644 +0000 UTC m=+222.086373185" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.629260 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" podUID="9b31867d-2f52-4f4c-943a-9431cb585027" containerName="route-controller-manager" containerID="cri-o://113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de" gracePeriod=30 Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.632352 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-xzx7n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.632406 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xzx7n" podUID="7074cf98-12f4-4a73-ad96-4959f64398a7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": 
dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.655850 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.657026 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.659994 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:18 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:18 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:18 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.660047 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.682735 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" podStartSLOduration=165.682718812 podStartE2EDuration="2m45.682718812s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.68185816 +0000 UTC m=+222.139889701" watchObservedRunningTime="2026-03-20 17:21:18.682718812 +0000 UTC m=+222.140750353" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.683980 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.685594 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.185566034 +0000 UTC m=+222.643597575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.741981 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.742506 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.744904 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" podStartSLOduration=165.74488996 podStartE2EDuration="2m45.74488996s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.700623086 +0000 UTC 
m=+222.158654627" watchObservedRunningTime="2026-03-20 17:21:18.74488996 +0000 UTC m=+222.202921501" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.745103 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" podStartSLOduration=165.745099307 podStartE2EDuration="2m45.745099307s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.742179392 +0000 UTC m=+222.200210933" watchObservedRunningTime="2026-03-20 17:21:18.745099307 +0000 UTC m=+222.203130848" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.774475 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" podStartSLOduration=165.774451183 podStartE2EDuration="2m45.774451183s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.773637784 +0000 UTC m=+222.231669325" watchObservedRunningTime="2026-03-20 17:21:18.774451183 +0000 UTC m=+222.232482724" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.786942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.802326 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" 
podStartSLOduration=165.802304026 podStartE2EDuration="2m45.802304026s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.791415904 +0000 UTC m=+222.249447445" watchObservedRunningTime="2026-03-20 17:21:18.802304026 +0000 UTC m=+222.260335567" Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.818731 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.318716076 +0000 UTC m=+222.776747617 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.821144 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" podStartSLOduration=165.821122153 podStartE2EDuration="2m45.821122153s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.817474372 +0000 UTC m=+222.275505913" watchObservedRunningTime="2026-03-20 17:21:18.821122153 +0000 UTC m=+222.279153694" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.891191 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.891516 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.391500926 +0000 UTC m=+222.849532467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.899112 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" podStartSLOduration=165.89909776 podStartE2EDuration="2m45.89909776s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.864253796 +0000 UTC m=+222.322285337" watchObservedRunningTime="2026-03-20 17:21:18.89909776 +0000 UTC m=+222.357129301" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.996753 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.997297 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.497286484 +0000 UTC m=+222.955318025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.013814 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.013864 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.035848 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.100057 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 
17:21:19.100370 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.600354904 +0000 UTC m=+223.058386445 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.203262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.203609 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.70359823 +0000 UTC m=+223.161629771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.217774 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.270577 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"] Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.270843 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b31867d-2f52-4f4c-943a-9431cb585027" containerName="route-controller-manager" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.270857 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b31867d-2f52-4f4c-943a-9431cb585027" containerName="route-controller-manager" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.270959 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b31867d-2f52-4f4c-943a-9431cb585027" containerName="route-controller-manager" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.271330 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.296735 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"] Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.305298 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.305362 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config\") pod \"9b31867d-2f52-4f4c-943a-9431cb585027\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.305426 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert\") pod \"9b31867d-2f52-4f4c-943a-9431cb585027\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.305461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpnfm\" (UniqueName: \"kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm\") pod \"9b31867d-2f52-4f4c-943a-9431cb585027\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.305498 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca\") pod \"9b31867d-2f52-4f4c-943a-9431cb585027\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.307342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca" (OuterVolumeSpecName: "client-ca") pod "9b31867d-2f52-4f4c-943a-9431cb585027" (UID: "9b31867d-2f52-4f4c-943a-9431cb585027"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.307424 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config" (OuterVolumeSpecName: "config") pod "9b31867d-2f52-4f4c-943a-9431cb585027" (UID: "9b31867d-2f52-4f4c-943a-9431cb585027"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.307519 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.8075036 +0000 UTC m=+223.265535141 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.317367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9b31867d-2f52-4f4c-943a-9431cb585027" (UID: "9b31867d-2f52-4f4c-943a-9431cb585027"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.320034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm" (OuterVolumeSpecName: "kube-api-access-hpnfm") pod "9b31867d-2f52-4f4c-943a-9431cb585027" (UID: "9b31867d-2f52-4f4c-943a-9431cb585027"). InnerVolumeSpecName "kube-api-access-hpnfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.389650 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.402102 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.406989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407021 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407082 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww6px\" (UniqueName: \"kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407156 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407186 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpnfm\" (UniqueName: \"kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407196 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407204 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407213 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.407443 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.907433267 +0000 UTC m=+223.365464808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5lkn\" (UniqueName: \"kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn\") pod \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508304 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca\") pod \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508333 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config\") pod \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert\") pod \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508485 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles\") pod \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.508873 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.008848428 +0000 UTC m=+223.466879969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca" (OuterVolumeSpecName: "client-ca") pod "ed2729b3-6b5a-4ae7-bad5-699c95dab85f" (UID: "ed2729b3-6b5a-4ae7-bad5-699c95dab85f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509226 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww6px\" (UniqueName: \"kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config" (OuterVolumeSpecName: "config") pod "ed2729b3-6b5a-4ae7-bad5-699c95dab85f" (UID: "ed2729b3-6b5a-4ae7-bad5-699c95dab85f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509286 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509308 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509348 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ed2729b3-6b5a-4ae7-bad5-699c95dab85f" (UID: "ed2729b3-6b5a-4ae7-bad5-699c95dab85f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.509367 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.009354716 +0000 UTC m=+223.467386257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509435 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509454 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.510344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.510617 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.522035 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn" (OuterVolumeSpecName: "kube-api-access-g5lkn") pod "ed2729b3-6b5a-4ae7-bad5-699c95dab85f" (UID: "ed2729b3-6b5a-4ae7-bad5-699c95dab85f"). InnerVolumeSpecName "kube-api-access-g5lkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.523859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ed2729b3-6b5a-4ae7-bad5-699c95dab85f" (UID: "ed2729b3-6b5a-4ae7-bad5-699c95dab85f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.525236 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.538303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww6px\" (UniqueName: \"kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.611269 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.611464 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.11143826 +0000 UTC m=+223.569469801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.611565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.611759 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5lkn\" (UniqueName: \"kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.611793 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.611802 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.612025 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:20.112013501 +0000 UTC m=+223.570045042 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.638198 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" event={"ID":"d1a9c8a4-d7c9-4365-8516-465b89c76ea8","Type":"ContainerStarted","Data":"99be79088ddfd7f80669aea1143f116e4bd5db832adf0b396f6b68c444c9f3fa"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.639151 4795 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-wwjgw container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.639178 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" podUID="d1a9c8a4-d7c9-4365-8516-465b89c76ea8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.641001 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b31867d-2f52-4f4c-943a-9431cb585027" containerID="113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de" exitCode=0 Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.641037 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" event={"ID":"9b31867d-2f52-4f4c-943a-9431cb585027","Type":"ContainerDied","Data":"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.641052 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" event={"ID":"9b31867d-2f52-4f4c-943a-9431cb585027","Type":"ContainerDied","Data":"f1055e0f897eb3de575665d60219e71c19392e8e5d91a217f1cf463d9d4c9c60"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.641066 4795 scope.go:117] "RemoveContainer" containerID="113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.641143 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.648463 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:19 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:19 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:19 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.648498 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.654908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" event={"ID":"ff31d5af-4eae-43e7-8512-c6f5d54501e1","Type":"ContainerStarted","Data":"c8a620a751d9e8694b7daa37279282b0fbf7008de6092ba237774e8743691e9d"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.670916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" event={"ID":"661df377-ed57-4f75-9be9-3fc5f87cf37e","Type":"ContainerStarted","Data":"4ba37ac78d373afad371c1ff6afb94705e84340687cde353bc19394d5ef469a9"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.671527 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.687178 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" event={"ID":"7dc57139-c6ad-4639-a09f-d07f8da49f4e","Type":"ContainerStarted","Data":"a437a01737b0fc2abcb65b81e5839d8a14fb18d1f8280765b38578848b11cbf4"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.688516 4795 scope.go:117] "RemoveContainer" containerID="113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.697931 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" podStartSLOduration=166.697916422 podStartE2EDuration="2m46.697916422s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.677745476 +0000 UTC m=+223.135777017" watchObservedRunningTime="2026-03-20 17:21:19.697916422 +0000 UTC m=+223.155947963" Mar 20 17:21:19 crc kubenswrapper[4795]: 
I0320 17:21:19.699131 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.701769 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.713515 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.714510 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.214489949 +0000 UTC m=+223.672521490 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.717897 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de\": container with ID starting with 113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de not found: ID does not exist" containerID="113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.717949 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de"} err="failed to get container status \"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de\": rpc error: code = NotFound desc = could not find container \"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de\": container with ID starting with 113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de not found: ID does not exist" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.728454 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerID="2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58" exitCode=0 Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.728547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" 
event={"ID":"ed2729b3-6b5a-4ae7-bad5-699c95dab85f","Type":"ContainerDied","Data":"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.728581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" event={"ID":"ed2729b3-6b5a-4ae7-bad5-699c95dab85f","Type":"ContainerDied","Data":"8dc54184aa3235d402105c8c10b4efe9d346fbed9538a68e181e2046d4944844"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.728597 4795 scope.go:117] "RemoveContainer" containerID="2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.728714 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.740993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" event={"ID":"7479d10c-1c3b-497e-8dda-07cd22aeccf0","Type":"ContainerStarted","Data":"8b647c9a4a8522117ddf86614575a4051c8a80c1f8a50361724bf5772af1c8e4"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.758746 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" podStartSLOduration=166.758729091 podStartE2EDuration="2m46.758729091s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.758194723 +0000 UTC m=+223.216226254" watchObservedRunningTime="2026-03-20 17:21:19.758729091 +0000 UTC m=+223.216760632" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.760901 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" podStartSLOduration=166.76089195 podStartE2EDuration="2m46.76089195s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.732973035 +0000 UTC m=+223.191004576" watchObservedRunningTime="2026-03-20 17:21:19.76089195 +0000 UTC m=+223.218923491" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.769816 4795 scope.go:117] "RemoveContainer" containerID="2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.770201 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5v822" event={"ID":"ee6682f1-2148-45a9-ac41-aeb6fddbabb4","Type":"ContainerStarted","Data":"1c7765a0d8468d27ef8c9fafd096ec42e11be0366fc85bb4abb6ca8b628ebadc"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.770347 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5v822" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.772510 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"] Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.774234 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58\": container with ID starting with 2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58 not found: ID does not exist" containerID="2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.774299 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58"} err="failed to get container status \"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58\": rpc error: code = NotFound desc = could not find container \"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58\": container with ID starting with 2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58 not found: ID does not exist" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.778837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-454wp" event={"ID":"489757b2-0de4-4275-8931-daa5c3b4a75a","Type":"ContainerStarted","Data":"254ef3aa6a62fa55712408eb37e7db7600f3e1dcb0583b6377071776549c08f7"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.781905 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"] Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.788907 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" event={"ID":"d2da6bf3-e17d-4adf-96dd-ea097cae192b","Type":"ContainerStarted","Data":"65f8d4a1b294283aea2b3503643700af7592aef35b45b1492d252b985db0e342"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.811750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" event={"ID":"587836f8-b700-43d0-940e-81d7820b2a6b","Type":"ContainerStarted","Data":"0bc6665630ff72f21309e606f4ce11e1137f70eb53ab17fe54ad2f4b5682fcc8"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.815720 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5v822" podStartSLOduration=8.815709602 podStartE2EDuration="8.815709602s" podCreationTimestamp="2026-03-20 17:21:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.812257868 +0000 UTC m=+223.270289409" watchObservedRunningTime="2026-03-20 17:21:19.815709602 +0000 UTC m=+223.273741143" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.834135 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" event={"ID":"fe2960a0-9218-4d46-8c50-7285c5e27882","Type":"ContainerStarted","Data":"006935bafc32330ec5f80ed2d0a4cb0d7be3eba372e67f9e4f9eb21a52e28ab6"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.841319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.843052 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.343039016 +0000 UTC m=+223.801070557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.846020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" event={"ID":"d0068e5c-7377-479d-9cc5-fd1270c74b33","Type":"ContainerStarted","Data":"9a18b4cf9d78c62f682f6919ed686ea00222f8e54f354caca0d82613fc23a31f"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.855761 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-xzx7n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.855820 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xzx7n" podUID="7074cf98-12f4-4a73-ad96-4959f64398a7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.856822 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.862273 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.870762 4795 
???:1] "http: TLS handshake error from 192.168.126.11:46044: no serving certificate available for the kubelet" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.917197 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" podStartSLOduration=166.917178665 podStartE2EDuration="2m46.917178665s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.866083966 +0000 UTC m=+223.324115507" watchObservedRunningTime="2026-03-20 17:21:19.917178665 +0000 UTC m=+223.375210206" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.947184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.948632 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.448611437 +0000 UTC m=+223.906642978 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.991327 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" podStartSLOduration=166.991309693 podStartE2EDuration="2m46.991309693s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.991082445 +0000 UTC m=+223.449113986" watchObservedRunningTime="2026-03-20 17:21:19.991309693 +0000 UTC m=+223.449341234" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.007984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.049633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.049971 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:20.549959514 +0000 UTC m=+224.007991055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.104389 4795 patch_prober.go:28] interesting pod/apiserver-76f77b778f-bl2bp container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]log ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]etcd ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/max-in-flight-filter ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 17:21:20 crc kubenswrapper[4795]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 17:21:20 crc kubenswrapper[4795]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 17:21:20 crc kubenswrapper[4795]: [-]poststarthook/openshift.io-startinformers failed: 
reason withheld Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 17:21:20 crc kubenswrapper[4795]: livez check failed Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.104449 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" podUID="5c21571e-5513-46e0-9eed-4ec64df8e445" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.151166 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.152283 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.652258016 +0000 UTC m=+224.110289557 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.220566 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"] Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.256510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.257650 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.757635489 +0000 UTC m=+224.215667030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.357362 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.357742 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.857714782 +0000 UTC m=+224.315746333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.462260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.462625 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.962614187 +0000 UTC m=+224.420645728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.563644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.563922 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.063895253 +0000 UTC m=+224.521926794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.564060 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.564338 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.064327019 +0000 UTC m=+224.522358560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.643069 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:20 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:20 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:20 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.643196 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.664682 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.664820 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:21.164794725 +0000 UTC m=+224.622826266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.664956 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.665257 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.165241731 +0000 UTC m=+224.623273272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.765678 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.765823 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.26579853 +0000 UTC m=+224.723830061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.766095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.766407 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.266398342 +0000 UTC m=+224.724429883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.780478 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.780749 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.780767 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.780876 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.781781 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.786126 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.796555 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.806854 4795 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.863660 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-454wp" event={"ID":"489757b2-0de4-4275-8931-daa5c3b4a75a","Type":"ContainerStarted","Data":"e2a08c6c0616c88e3639564a54bd192554c1ae0d5f874d76d44021f4ba46ce86"} Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.863721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-454wp" event={"ID":"489757b2-0de4-4275-8931-daa5c3b4a75a","Type":"ContainerStarted","Data":"0813db22d2e9f2c49d4516839f56a75d5403aaebf57bb7fec6cb3b21729f3ef8"} Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.866592 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" event={"ID":"4c425d50-cbc6-4fa3-b286-ef1b8d696198","Type":"ContainerStarted","Data":"fafe861a908f21e3bae7c524d1594f030ac1c3cab2621b45672111e5737afdd5"} Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.866668 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" 
event={"ID":"4c425d50-cbc6-4fa3-b286-ef1b8d696198","Type":"ContainerStarted","Data":"6092fd1860a57c1ca6f62820e564b6ec08d02b8c7829faf4d750183f7837476f"} Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.866832 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.867128 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.367108147 +0000 UTC m=+224.825139698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.867676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.868034 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.368019159 +0000 UTC m=+224.826050700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.877509 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.892708 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" podStartSLOduration=3.892677576 podStartE2EDuration="3.892677576s" podCreationTimestamp="2026-03-20 17:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:20.889926888 +0000 UTC m=+224.347958429" watchObservedRunningTime="2026-03-20 17:21:20.892677576 +0000 UTC m=+224.350709117" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.969155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.969844 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6zsd\" (UniqueName: \"kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.970166 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.970221 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.970311 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.470297421 +0000 UTC m=+224.928328962 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.983191 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.985125 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.987935 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.004483 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072162 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6zsd\" (UniqueName: \"kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072252 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 
17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2vjc\" (UniqueName: \"kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072324 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072355 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: 
\"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.072601 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.572590333 +0000 UTC m=+225.030621874 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.073457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.073520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.094435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6zsd\" (UniqueName: \"kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd\") pod \"community-operators-kk5rk\" (UID: 
\"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.103711 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173035 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.173217 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.673190064 +0000 UTC m=+225.131221605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173331 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173364 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2vjc\" (UniqueName: \"kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: 
\"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.173868 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.673861668 +0000 UTC m=+225.131893209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173907 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.175987 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.176941 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.190249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2vjc\" (UniqueName: \"kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.198863 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.259037 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b31867d-2f52-4f4c-943a-9431cb585027" path="/var/lib/kubelet/pods/9b31867d-2f52-4f4c-943a-9431cb585027/volumes" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.261219 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" path="/var/lib/kubelet/pods/ed2729b3-6b5a-4ae7-bad5-699c95dab85f/volumes" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.274272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.274490 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.774459119 +0000 UTC m=+225.232490720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.274720 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.274784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.274935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.275020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qsng\" (UniqueName: 
\"kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.275300 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.775291759 +0000 UTC m=+225.233323370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.302554 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.374157 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.375302 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.375755 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.375901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.375953 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qsng\" (UniqueName: \"kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.376030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.376106 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:21.876079047 +0000 UTC m=+225.334110588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.376396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.376438 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.383196 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.395448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qsng\" (UniqueName: \"kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.480289 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vks54\" (UniqueName: \"kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.480319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.480361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.480383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.480672 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.98066228 +0000 UTC m=+225.438693821 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.489998 4795 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T17:21:20.806878929Z","Handler":null,"Name":""} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.494779 4795 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.494811 4795 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.529366 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.581405 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.581734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.581755 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vks54\" (UniqueName: \"kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.581805 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.582222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " 
pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.582325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.599956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vks54\" (UniqueName: \"kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.604016 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.633955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.641988 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:21 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:21 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:21 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.642045 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.683919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.689394 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.689422 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.722352 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.725974 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:21:21 crc kubenswrapper[4795]: W0320 17:21:21.729243 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b4d98b5_0434_4a84_b890_d2428de998b7.slice/crio-2e4b95450b8315d24d755f1e11d49c142fbb519e0e3cc346ea05ef39f9bff4b9 WatchSource:0}: Error finding container 2e4b95450b8315d24d755f1e11d49c142fbb519e0e3cc346ea05ef39f9bff4b9: Status 404 returned error can't find the container with id 2e4b95450b8315d24d755f1e11d49c142fbb519e0e3cc346ea05ef39f9bff4b9 Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.743232 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.754658 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.816767 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.879127 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-454wp" event={"ID":"489757b2-0de4-4275-8931-daa5c3b4a75a","Type":"ContainerStarted","Data":"8fffbb3c7f2eb6f9ce46db083a84a057678e59722980a30f14937a525bbf5adc"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.894184 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.894795 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerStarted","Data":"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902118 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerStarted","Data":"2e4b95450b8315d24d755f1e11d49c142fbb519e0e3cc346ea05ef39f9bff4b9"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902442 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902768 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902937 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902969 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.905425 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.909770 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.910442 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.915181 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.921030 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-454wp" podStartSLOduration=10.921006361 podStartE2EDuration="10.921006361s" podCreationTimestamp="2026-03-20 17:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:21.912734533 +0000 UTC m=+225.370766094" watchObservedRunningTime="2026-03-20 17:21:21.921006361 +0000 UTC m=+225.379037902" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.926010 4795 generic.go:334] "Generic (PLEG): container finished" podID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerID="63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031" exitCode=0 Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.926098 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerDied","Data":"63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.926123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerStarted","Data":"352f21e959b8a9617f62fdaa474337c620b65ea35de203e2a6258d4f6ab66557"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.940419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" 
event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerStarted","Data":"333adeb9b81abd47208fc6ec71e454bad1f18be9356efa101b49dd2d5983cc19"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.941564 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.953888 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.976798 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.990115 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.990207 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.990226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " 
pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.990253 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.990295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc2m9\" (UniqueName: \"kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: W0320 17:21:21.994538 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b3b1055_857d_4334_b39a_24b0ac9139d1.slice/crio-1491707e1e9a76fca81ebcac286e8ae28085443c9768ed4379919b2d9e23dca3 WatchSource:0}: Error finding container 1491707e1e9a76fca81ebcac286e8ae28085443c9768ed4379919b2d9e23dca3: Status 404 returned error can't find the container with id 1491707e1e9a76fca81ebcac286e8ae28085443c9768ed4379919b2d9e23dca3 Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.055477 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"] Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.091228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: 
\"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.091372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.091390 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.091438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.091466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc2m9\" (UniqueName: \"kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.097971 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.098549 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.099904 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.114763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc2m9\" (UniqueName: \"kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.125954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.242825 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.452723 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46046: no serving certificate available for the kubelet" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.456179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"] Mar 20 17:21:22 crc kubenswrapper[4795]: W0320 17:21:22.464386 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5317d308_31fb_4863_bf91_5ba6a632ba67.slice/crio-27fc0f6af22544b7220dee278277b66ac2ca97b4247b7db43a6f4fefc1e35570 WatchSource:0}: Error finding container 27fc0f6af22544b7220dee278277b66ac2ca97b4247b7db43a6f4fefc1e35570: Status 404 returned error can't find the container with id 27fc0f6af22544b7220dee278277b66ac2ca97b4247b7db43a6f4fefc1e35570 Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.643890 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:22 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:22 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:22 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.643943 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.771243 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.772453 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.775051 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.784736 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.902613 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.902716 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.902769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgvm8\" (UniqueName: \"kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.957580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" event={"ID":"5317d308-31fb-4863-bf91-5ba6a632ba67","Type":"ContainerStarted","Data":"9c453f9ce1a481a89895009d4dd065491b2d55038377ff11c4e29a62f9828eb6"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.957631 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" event={"ID":"5317d308-31fb-4863-bf91-5ba6a632ba67","Type":"ContainerStarted","Data":"27fc0f6af22544b7220dee278277b66ac2ca97b4247b7db43a6f4fefc1e35570"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.957648 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.963330 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerID="82d8d3d7e1e3eb80a041eea63e969e0d1aa9af7af1cccc7c6c9f3460b4809935" exitCode=0 Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.963368 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerDied","Data":"82d8d3d7e1e3eb80a041eea63e969e0d1aa9af7af1cccc7c6c9f3460b4809935"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.963414 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerStarted","Data":"1491707e1e9a76fca81ebcac286e8ae28085443c9768ed4379919b2d9e23dca3"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.966153 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" event={"ID":"3dde633a-aefe-4c9b-84a7-301279016583","Type":"ContainerStarted","Data":"76b0d688b149e45910b8799bbb4e20410e0480f70929e235a28f86178319123a"} 
Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.966189 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" event={"ID":"3dde633a-aefe-4c9b-84a7-301279016583","Type":"ContainerStarted","Data":"3db277134197ff5142f4f0d85c126502b98d3bb29670b6c4409e582bcdf40d86"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.966293 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.969036 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerID="051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6" exitCode=0 Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.969073 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerDied","Data":"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.972174 4795 generic.go:334] "Generic (PLEG): container finished" podID="57849322-f280-42ee-a330-18120aeed5db" containerID="cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1" exitCode=0 Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.972225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerDied","Data":"cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.975140 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.985596 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" podStartSLOduration=5.985575499 podStartE2EDuration="5.985575499s" podCreationTimestamp="2026-03-20 17:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:22.978864637 +0000 UTC m=+226.436896198" watchObservedRunningTime="2026-03-20 17:21:22.985575499 +0000 UTC m=+226.443607040" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.004778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.004903 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgvm8\" (UniqueName: \"kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.005042 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.005310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " 
pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.005406 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.025067 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgvm8\" (UniqueName: \"kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.133978 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.174717 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" podStartSLOduration=170.174699656 podStartE2EDuration="2m50.174699656s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:23.093883477 +0000 UTC m=+226.551915018" watchObservedRunningTime="2026-03-20 17:21:23.174699656 +0000 UTC m=+226.632731197" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.175962 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.176900 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.193106 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.289433 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.325991 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js4cp\" (UniqueName: \"kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.326083 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.326145 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.427468 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js4cp\" (UniqueName: 
\"kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.427537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.427601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.428065 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.428337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.452553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js4cp\" (UniqueName: 
\"kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.522261 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.579522 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:21:23 crc kubenswrapper[4795]: W0320 17:21:23.586880 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70000016_e928_4b11_a31d_4d08e9450a1c.slice/crio-37c22f0e8db69278ef99884d66f0d1b39626955adbad846c6823797b6df30257 WatchSource:0}: Error finding container 37c22f0e8db69278ef99884d66f0d1b39626955adbad846c6823797b6df30257: Status 404 returned error can't find the container with id 37c22f0e8db69278ef99884d66f0d1b39626955adbad846c6823797b6df30257 Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.641894 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:23 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:23 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:23 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.641955 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:23 crc kubenswrapper[4795]: 
I0320 17:21:23.761670 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.773372 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.792419 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.792903 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.797755 4795 patch_prober.go:28] interesting pod/console-f9d7485db-hn4r8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.797799 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hn4r8" podUID="662f8843-e25d-48ce-989d-9ea05937757d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.900915 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.916783 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.917638 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.920001 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.920464 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.921823 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.991874 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.996720 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:23.999903 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerStarted","Data":"0bf34893ceb2a123dbae4a13fdf9053d4d9c1472bbfe52b966a8795f5fc54346"} Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.000899 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.001797 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.006500 4795 generic.go:334] "Generic (PLEG): container finished" podID="70000016-e928-4b11-a31d-4d08e9450a1c" containerID="9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25" exitCode=0 Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 
17:21:24.006855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerDied","Data":"9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25"} Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.006888 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerStarted","Data":"37c22f0e8db69278ef99884d66f0d1b39626955adbad846c6823797b6df30257"} Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.043943 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.044072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.144893 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhxfb\" (UniqueName: \"kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.145491 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.145575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.145598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.145614 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.155271 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.188548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.246736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.246823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxfb\" (UniqueName: \"kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.246904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.247505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.247661 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.265057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhxfb\" (UniqueName: \"kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.275550 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.322563 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.375149 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"] Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.376147 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.383120 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"] Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.452324 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.452388 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.452416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dff8d\" (UniqueName: \"kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.484925 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-xzx7n container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.484966 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-xzx7n container/download-server namespace/openshift-console: Readiness 
probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.484986 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xzx7n" podUID="7074cf98-12f4-4a73-ad96-4959f64398a7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.485024 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xzx7n" podUID="7074cf98-12f4-4a73-ad96-4959f64398a7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.553696 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.553739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.553755 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dff8d\" (UniqueName: \"kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " 
pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.554424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.554635 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.584964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dff8d\" (UniqueName: \"kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.632217 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.638768 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.641386 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:24 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:24 
crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:24 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.641432 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.749130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.013308 4795 generic.go:334] "Generic (PLEG): container finished" podID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerID="356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf" exitCode=0 Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.013391 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerDied","Data":"356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf"} Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.640950 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.643200 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.713040 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.713637 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.715269 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.715880 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.730895 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.881611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.881748 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.983592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.983648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.983753 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:26 crc kubenswrapper[4795]: I0320 17:21:26.003085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:26 crc kubenswrapper[4795]: I0320 17:21:26.021095 4795 generic.go:334] "Generic (PLEG): container finished" podID="918aa57e-8c94-4427-b6bd-218a5687d684" containerID="1a29e74f6dc8f40ef08045f483f837253c35a577aa6f85ce5cd8c2a56afebf9c" exitCode=0 Mar 20 17:21:26 crc kubenswrapper[4795]: I0320 17:21:26.021156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" event={"ID":"918aa57e-8c94-4427-b6bd-218a5687d684","Type":"ContainerDied","Data":"1a29e74f6dc8f40ef08045f483f837253c35a577aa6f85ce5cd8c2a56afebf9c"} Mar 20 17:21:26 crc kubenswrapper[4795]: I0320 17:21:26.043145 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:27 crc kubenswrapper[4795]: I0320 17:21:27.196604 4795 ???:1] "http: TLS handshake error from 192.168.126.11:45376: no serving certificate available for the kubelet" Mar 20 17:21:27 crc kubenswrapper[4795]: I0320 17:21:27.605194 4795 ???:1] "http: TLS handshake error from 192.168.126.11:45386: no serving certificate available for the kubelet" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.462525 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.645364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzk8h\" (UniqueName: \"kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h\") pod \"918aa57e-8c94-4427-b6bd-218a5687d684\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.645413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume\") pod \"918aa57e-8c94-4427-b6bd-218a5687d684\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.646720 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume\") pod \"918aa57e-8c94-4427-b6bd-218a5687d684\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.647673 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"918aa57e-8c94-4427-b6bd-218a5687d684" (UID: "918aa57e-8c94-4427-b6bd-218a5687d684"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.656086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "918aa57e-8c94-4427-b6bd-218a5687d684" (UID: "918aa57e-8c94-4427-b6bd-218a5687d684"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.656713 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h" (OuterVolumeSpecName: "kube-api-access-vzk8h") pod "918aa57e-8c94-4427-b6bd-218a5687d684" (UID: "918aa57e-8c94-4427-b6bd-218a5687d684"). InnerVolumeSpecName "kube-api-access-vzk8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.748029 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.748062 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzk8h\" (UniqueName: \"kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.748073 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.824572 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5v822" Mar 20 17:21:30 crc kubenswrapper[4795]: I0320 17:21:30.048424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" event={"ID":"918aa57e-8c94-4427-b6bd-218a5687d684","Type":"ContainerDied","Data":"552c19deec1ee9883f89b895bd5a9ae748bbcb5e7537b45d9e966f6c5f189edb"} Mar 20 17:21:30 crc kubenswrapper[4795]: I0320 17:21:30.048472 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552c19deec1ee9883f89b895bd5a9ae748bbcb5e7537b45d9e966f6c5f189edb" Mar 20 17:21:30 crc kubenswrapper[4795]: I0320 17:21:30.048479 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:32 crc kubenswrapper[4795]: I0320 17:21:32.191265 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:32 crc kubenswrapper[4795]: I0320 17:21:32.197060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:32 crc kubenswrapper[4795]: I0320 17:21:32.378408 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:33 crc kubenswrapper[4795]: I0320 17:21:33.791534 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:33 crc kubenswrapper[4795]: I0320 17:21:33.795202 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:34 crc kubenswrapper[4795]: I0320 17:21:34.491166 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:34 crc kubenswrapper[4795]: I0320 17:21:34.491521 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:21:35 crc kubenswrapper[4795]: W0320 17:21:35.195904 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73dd05f7_2cc4_4a99_b12d_26e4d436acca.slice/crio-d21b253fa758e914360b02dd8aa7261d5b383defcc69cfc8b102952a167fd840 WatchSource:0}: Error finding container d21b253fa758e914360b02dd8aa7261d5b383defcc69cfc8b102952a167fd840: Status 404 returned error can't find the container with id d21b253fa758e914360b02dd8aa7261d5b383defcc69cfc8b102952a167fd840 Mar 20 17:21:36 crc kubenswrapper[4795]: I0320 17:21:36.080483 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerStarted","Data":"d21b253fa758e914360b02dd8aa7261d5b383defcc69cfc8b102952a167fd840"} Mar 20 17:21:36 crc kubenswrapper[4795]: I0320 17:21:36.443107 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"] Mar 20 17:21:36 crc kubenswrapper[4795]: I0320 17:21:36.443307 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerName="controller-manager" containerID="cri-o://9c453f9ce1a481a89895009d4dd065491b2d55038377ff11c4e29a62f9828eb6" gracePeriod=30 Mar 20 17:21:36 crc kubenswrapper[4795]: I0320 17:21:36.451466 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"] Mar 20 17:21:36 crc kubenswrapper[4795]: I0320 17:21:36.451756 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerName="route-controller-manager" containerID="cri-o://fafe861a908f21e3bae7c524d1594f030ac1c3cab2621b45672111e5737afdd5" gracePeriod=30 Mar 20 17:21:37 crc kubenswrapper[4795]: I0320 17:21:37.087103 4795 
generic.go:334] "Generic (PLEG): container finished" podID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerID="9c453f9ce1a481a89895009d4dd065491b2d55038377ff11c4e29a62f9828eb6" exitCode=0 Mar 20 17:21:37 crc kubenswrapper[4795]: I0320 17:21:37.087202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" event={"ID":"5317d308-31fb-4863-bf91-5ba6a632ba67","Type":"ContainerDied","Data":"9c453f9ce1a481a89895009d4dd065491b2d55038377ff11c4e29a62f9828eb6"} Mar 20 17:21:37 crc kubenswrapper[4795]: I0320 17:21:37.093208 4795 generic.go:334] "Generic (PLEG): container finished" podID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerID="fafe861a908f21e3bae7c524d1594f030ac1c3cab2621b45672111e5737afdd5" exitCode=0 Mar 20 17:21:37 crc kubenswrapper[4795]: I0320 17:21:37.093245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" event={"ID":"4c425d50-cbc6-4fa3-b286-ef1b8d696198","Type":"ContainerDied","Data":"fafe861a908f21e3bae7c524d1594f030ac1c3cab2621b45672111e5737afdd5"} Mar 20 17:21:39 crc kubenswrapper[4795]: I0320 17:21:39.673129 4795 patch_prober.go:28] interesting pod/route-controller-manager-7fb98dc7f7-g49lt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 20 17:21:39 crc kubenswrapper[4795]: I0320 17:21:39.673231 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 20 17:21:41 crc kubenswrapper[4795]: I0320 17:21:41.300265 4795 
patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:21:41 crc kubenswrapper[4795]: I0320 17:21:41.300363 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:21:41 crc kubenswrapper[4795]: I0320 17:21:41.823431 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:42 crc kubenswrapper[4795]: I0320 17:21:42.244045 4795 patch_prober.go:28] interesting pod/controller-manager-bbf9678f8-ftr7c container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Mar 20 17:21:42 crc kubenswrapper[4795]: I0320 17:21:42.244108 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Mar 20 17:21:42 crc kubenswrapper[4795]: E0320 17:21:42.970903 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 17:21:42 crc kubenswrapper[4795]: E0320 17:21:42.971368 
4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w2vjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kzvch_openshift-marketplace(fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:21:42 crc kubenswrapper[4795]: E0320 17:21:42.973240 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kzvch" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" Mar 20 17:21:43 crc kubenswrapper[4795]: I0320 17:21:43.237540 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:21:43 crc kubenswrapper[4795]: I0320 17:21:43.347261 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"] Mar 20 17:21:43 crc kubenswrapper[4795]: I0320 17:21:43.358750 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:21:47 crc kubenswrapper[4795]: E0320 17:21:47.016149 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kzvch" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" Mar 20 17:21:47 crc kubenswrapper[4795]: E0320 17:21:47.240251 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 17:21:47 crc kubenswrapper[4795]: E0320 17:21:47.240752 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2qsng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hv8kd_openshift-marketplace(7b4d98b5-0434-4a84-b890-d2428de998b7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:21:47 crc kubenswrapper[4795]: E0320 17:21:47.242024 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hv8kd" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" Mar 20 17:21:48 crc 
kubenswrapper[4795]: I0320 17:21:48.105607 4795 ???:1] "http: TLS handshake error from 192.168.126.11:39038: no serving certificate available for the kubelet" Mar 20 17:21:48 crc kubenswrapper[4795]: E0320 17:21:48.736247 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hv8kd" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" Mar 20 17:21:48 crc kubenswrapper[4795]: W0320 17:21:48.738520 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8dc34161_d5d0_4580_88a1_c5e2b55c924d.slice/crio-72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6 WatchSource:0}: Error finding container 72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6: Status 404 returned error can't find the container with id 72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6 Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.890290 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.910988 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.912780 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jpp4c"] Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.924273 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"] Mar 20 17:21:48 crc kubenswrapper[4795]: E0320 17:21:48.924525 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerName="route-controller-manager" Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.924602 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerName="route-controller-manager" Mar 20 17:21:48 crc kubenswrapper[4795]: E0320 17:21:48.924620 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="918aa57e-8c94-4427-b6bd-218a5687d684" containerName="collect-profiles" Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.924778 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="918aa57e-8c94-4427-b6bd-218a5687d684" containerName="collect-profiles" Mar 20 17:21:48 crc kubenswrapper[4795]: E0320 17:21:48.924796 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerName="controller-manager" Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.924806 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerName="controller-manager" Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.925325 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerName="route-controller-manager" Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.925345 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerName="controller-manager" Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.925362 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="918aa57e-8c94-4427-b6bd-218a5687d684" containerName="collect-profiles" Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.925976 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:48 crc kubenswrapper[4795]: W0320 17:21:48.936420 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod996ef79e_1d5b_4e1b_b1f0_efd1ca2c9a77.slice/crio-e2a7977d25bb2572a82872c249f249b3bcd9caade6c08d5f54b5f1f678db3ddd WatchSource:0}: Error finding container e2a7977d25bb2572a82872c249f249b3bcd9caade6c08d5f54b5f1f678db3ddd: Status 404 returned error can't find the container with id e2a7977d25bb2572a82872c249f249b3bcd9caade6c08d5f54b5f1f678db3ddd Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.938786 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"] Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert\") pod \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040727 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca\") pod \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " Mar 20 17:21:49 crc 
kubenswrapper[4795]: I0320 17:21:49.040746 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config\") pod \"5317d308-31fb-4863-bf91-5ba6a632ba67\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww6px\" (UniqueName: \"kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px\") pod \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040802 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles\") pod \"5317d308-31fb-4863-bf91-5ba6a632ba67\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040824 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config\") pod \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040881 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca\") pod \"5317d308-31fb-4863-bf91-5ba6a632ba67\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040900 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc2m9\" (UniqueName: \"kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9\") 
pod \"5317d308-31fb-4863-bf91-5ba6a632ba67\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040920 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert\") pod \"5317d308-31fb-4863-bf91-5ba6a632ba67\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.041086 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.041128 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.041169 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.041190 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69x8r\" (UniqueName: 
\"kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.043034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c425d50-cbc6-4fa3-b286-ef1b8d696198" (UID: "4c425d50-cbc6-4fa3-b286-ef1b8d696198"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.043046 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca" (OuterVolumeSpecName: "client-ca") pod "5317d308-31fb-4863-bf91-5ba6a632ba67" (UID: "5317d308-31fb-4863-bf91-5ba6a632ba67"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.043058 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config" (OuterVolumeSpecName: "config") pod "5317d308-31fb-4863-bf91-5ba6a632ba67" (UID: "5317d308-31fb-4863-bf91-5ba6a632ba67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.043634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config" (OuterVolumeSpecName: "config") pod "4c425d50-cbc6-4fa3-b286-ef1b8d696198" (UID: "4c425d50-cbc6-4fa3-b286-ef1b8d696198"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.043918 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5317d308-31fb-4863-bf91-5ba6a632ba67" (UID: "5317d308-31fb-4863-bf91-5ba6a632ba67"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.046618 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5317d308-31fb-4863-bf91-5ba6a632ba67" (UID: "5317d308-31fb-4863-bf91-5ba6a632ba67"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.046658 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c425d50-cbc6-4fa3-b286-ef1b8d696198" (UID: "4c425d50-cbc6-4fa3-b286-ef1b8d696198"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.046757 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9" (OuterVolumeSpecName: "kube-api-access-jc2m9") pod "5317d308-31fb-4863-bf91-5ba6a632ba67" (UID: "5317d308-31fb-4863-bf91-5ba6a632ba67"). InnerVolumeSpecName "kube-api-access-jc2m9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.046941 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px" (OuterVolumeSpecName: "kube-api-access-ww6px") pod "4c425d50-cbc6-4fa3-b286-ef1b8d696198" (UID: "4c425d50-cbc6-4fa3-b286-ef1b8d696198"). InnerVolumeSpecName "kube-api-access-ww6px". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142198 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142249 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69x8r\" (UniqueName: 
\"kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142358 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142369 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142378 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142389 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww6px\" (UniqueName: \"kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142398 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142406 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142414 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142424 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc2m9\" (UniqueName: \"kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142432 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.143087 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.143477 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.146207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.159522 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69x8r\" (UniqueName: \"kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.159846 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cdc3472c-58bf-4b57-aa00-34677fc42e06","Type":"ContainerStarted","Data":"cdf769cace6742ee2c32cad7bdf9fa8b4a0644fb379f47fb111fd1d463a2ab13"} Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.162587 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.162580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" event={"ID":"4c425d50-cbc6-4fa3-b286-ef1b8d696198","Type":"ContainerDied","Data":"6092fd1860a57c1ca6f62820e564b6ec08d02b8c7829faf4d750183f7837476f"} Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.163001 4795 scope.go:117] "RemoveContainer" containerID="fafe861a908f21e3bae7c524d1594f030ac1c3cab2621b45672111e5737afdd5" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.164736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" event={"ID":"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77","Type":"ContainerStarted","Data":"e2a7977d25bb2572a82872c249f249b3bcd9caade6c08d5f54b5f1f678db3ddd"} Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.167488 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"8dc34161-d5d0-4580-88a1-c5e2b55c924d","Type":"ContainerStarted","Data":"72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6"} Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.168952 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" event={"ID":"5317d308-31fb-4863-bf91-5ba6a632ba67","Type":"ContainerDied","Data":"27fc0f6af22544b7220dee278277b66ac2ca97b4247b7db43a6f4fefc1e35570"} Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.169028 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.172128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerStarted","Data":"d58b7f5b37a6a35ef39d3d8b6ebcad0e2da7e5425eb92dcc45ec68fe40722e18"} Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.190579 4795 scope.go:117] "RemoveContainer" containerID="9c453f9ce1a481a89895009d4dd065491b2d55038377ff11c4e29a62f9828eb6" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.196436 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"] Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.199282 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"] Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.209060 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"] Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.213844 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"] Mar 20 
17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.263523 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" path="/var/lib/kubelet/pods/4c425d50-cbc6-4fa3-b286-ef1b8d696198/volumes" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.264277 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" path="/var/lib/kubelet/pods/5317d308-31fb-4863-bf91-5ba6a632ba67/volumes" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.304524 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.440269 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.440730 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgvm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ht4zv_openshift-marketplace(70000016-e928-4b11-a31d-4d08e9450a1c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.441942 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ht4zv" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" Mar 20 17:21:49 crc 
kubenswrapper[4795]: I0320 17:21:49.550573 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"] Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.689065 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.689233 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6zsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Terminati
onMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kk5rk_openshift-marketplace(57849322-f280-42ee-a330-18120aeed5db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.691228 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kk5rk" podUID="57849322-f280-42ee-a330-18120aeed5db" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.763805 4795 csr.go:261] certificate signing request csr-8w7w5 is approved, waiting to be issued Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.771003 4795 csr.go:257] certificate signing request csr-8w7w5 is issued Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.180208 4795 generic.go:334] "Generic (PLEG): container finished" podID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerID="6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394" exitCode=0 Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.180564 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerDied","Data":"6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.185209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerStarted","Data":"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d"} Mar 20 
17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.189442 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" event={"ID":"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77","Type":"ContainerStarted","Data":"54240c5223551154fcb65b1918dadaadd6a2548bed4a555bb8ff6486ab7d05e0"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.189484 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" event={"ID":"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77","Type":"ContainerStarted","Data":"b089f2b080867dc9421a7a3c2eaaa26fac7817cf5a7db10470ab8773973f1414"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.193610 4795 generic.go:334] "Generic (PLEG): container finished" podID="bed1d31b-b060-45c3-95bf-3b226a36efe1" containerID="76aa98549ce46db60ce0a3b7fd4c6b9ed28e4c1b7375fc84abcdb33fcf4ef287" exitCode=0 Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.193727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567120-j7789" event={"ID":"bed1d31b-b060-45c3-95bf-3b226a36efe1","Type":"ContainerDied","Data":"76aa98549ce46db60ce0a3b7fd4c6b9ed28e4c1b7375fc84abcdb33fcf4ef287"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.201400 4795 generic.go:334] "Generic (PLEG): container finished" podID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerID="898e9ddb331a961041951b7bb1edfb2abf5db69d1009da036bfe796e8579e1e3" exitCode=0 Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.201452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerDied","Data":"898e9ddb331a961041951b7bb1edfb2abf5db69d1009da036bfe796e8579e1e3"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.213968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" 
event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerStarted","Data":"7ec8b989964af7b47bbfe9e2ba650db31b2b4a3194973fb0444b0a3b977d10bc"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.218143 4795 generic.go:334] "Generic (PLEG): container finished" podID="cdc3472c-58bf-4b57-aa00-34677fc42e06" containerID="b2343ae78e6e4067fd48d7e9e2379803526af28d2e60ae47d7c80ca51f8a9546" exitCode=0 Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.218207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cdc3472c-58bf-4b57-aa00-34677fc42e06","Type":"ContainerDied","Data":"b2343ae78e6e4067fd48d7e9e2379803526af28d2e60ae47d7c80ca51f8a9546"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.219931 4795 generic.go:334] "Generic (PLEG): container finished" podID="8dc34161-d5d0-4580-88a1-c5e2b55c924d" containerID="6136ff7225ac1cecd4f6d0e6322199086bbed28ae9106bcb0c283d76cfa16319" exitCode=0 Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.219973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8dc34161-d5d0-4580-88a1-c5e2b55c924d","Type":"ContainerDied","Data":"6136ff7225ac1cecd4f6d0e6322199086bbed28ae9106bcb0c283d76cfa16319"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.222391 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" event={"ID":"af030096-8488-42df-be2c-a39b58ff0612","Type":"ContainerStarted","Data":"de73514b8f320ab741cc124ca96503c98c90527d123632a693748372593ae1e0"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.222551 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.222565 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" event={"ID":"af030096-8488-42df-be2c-a39b58ff0612","Type":"ContainerStarted","Data":"d97024fb9f1af8ccac26b9f9b512b0db9ba1fd92cda4860946d1b691addcb115"} Mar 20 17:21:50 crc kubenswrapper[4795]: E0320 17:21:50.225215 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kk5rk" podUID="57849322-f280-42ee-a330-18120aeed5db" Mar 20 17:21:50 crc kubenswrapper[4795]: E0320 17:21:50.228549 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ht4zv" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.229656 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jpp4c" podStartSLOduration=197.229638693 podStartE2EDuration="3m17.229638693s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:50.215037768 +0000 UTC m=+253.673069309" watchObservedRunningTime="2026-03-20 17:21:50.229638693 +0000 UTC m=+253.687670244" Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.235530 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.361747 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" podStartSLOduration=14.361733688 podStartE2EDuration="14.361733688s" podCreationTimestamp="2026-03-20 17:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:50.358321055 +0000 UTC m=+253.816352586" watchObservedRunningTime="2026-03-20 17:21:50.361733688 +0000 UTC m=+253.819765229" Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.772384 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-31 22:59:29.608681624 +0000 UTC Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.772424 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6869h37m38.836260843s for next certificate rotation Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.230952 4795 generic.go:334] "Generic (PLEG): container finished" podID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerID="f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d" exitCode=0 Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.231041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerDied","Data":"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d"} Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.232580 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerID="7ec8b989964af7b47bbfe9e2ba650db31b2b4a3194973fb0444b0a3b977d10bc" exitCode=0 Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.233432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" 
event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerDied","Data":"7ec8b989964af7b47bbfe9e2ba650db31b2b4a3194973fb0444b0a3b977d10bc"} Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.529227 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.607592 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.615714 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.675371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access\") pod \"cdc3472c-58bf-4b57-aa00-34677fc42e06\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.675521 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir\") pod \"cdc3472c-58bf-4b57-aa00-34677fc42e06\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.675872 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cdc3472c-58bf-4b57-aa00-34677fc42e06" (UID: "cdc3472c-58bf-4b57-aa00-34677fc42e06"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.681073 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cdc3472c-58bf-4b57-aa00-34677fc42e06" (UID: "cdc3472c-58bf-4b57-aa00-34677fc42e06"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776275 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fwn5\" (UniqueName: \"kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5\") pod \"bed1d31b-b060-45c3-95bf-3b226a36efe1\" (UID: \"bed1d31b-b060-45c3-95bf-3b226a36efe1\") " Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776406 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir\") pod \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776452 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access\") pod \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776713 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776712 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8dc34161-d5d0-4580-88a1-c5e2b55c924d" (UID: "8dc34161-d5d0-4580-88a1-c5e2b55c924d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776730 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.781145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8dc34161-d5d0-4580-88a1-c5e2b55c924d" (UID: "8dc34161-d5d0-4580-88a1-c5e2b55c924d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.781698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5" (OuterVolumeSpecName: "kube-api-access-9fwn5") pod "bed1d31b-b060-45c3-95bf-3b226a36efe1" (UID: "bed1d31b-b060-45c3-95bf-3b226a36efe1"). InnerVolumeSpecName "kube-api-access-9fwn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.878097 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.878132 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.878146 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fwn5\" (UniqueName: \"kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917317 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:21:51 crc kubenswrapper[4795]: E0320 17:21:51.917528 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc34161-d5d0-4580-88a1-c5e2b55c924d" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917540 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc34161-d5d0-4580-88a1-c5e2b55c924d" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: E0320 17:21:51.917549 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc3472c-58bf-4b57-aa00-34677fc42e06" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917556 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc3472c-58bf-4b57-aa00-34677fc42e06" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: E0320 17:21:51.917569 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed1d31b-b060-45c3-95bf-3b226a36efe1" 
containerName="oc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917575 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed1d31b-b060-45c3-95bf-3b226a36efe1" containerName="oc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917674 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc34161-d5d0-4580-88a1-c5e2b55c924d" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917701 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed1d31b-b060-45c3-95bf-3b226a36efe1" containerName="oc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917711 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc3472c-58bf-4b57-aa00-34677fc42e06" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.918046 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.920919 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.921034 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.921096 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.921137 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.921311 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.921345 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.925424 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.929945 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.080862 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.080908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.080975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57hzx\" (UniqueName: \"kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.081027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.081072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.182720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57hzx\" (UniqueName: \"kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.182816 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.182868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.182886 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.182901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.183864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.184913 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.186531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 
17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.187155 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.198347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57hzx\" (UniqueName: \"kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.242574 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.247674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerStarted","Data":"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784"} Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.249328 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.249372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8dc34161-d5d0-4580-88a1-c5e2b55c924d","Type":"ContainerDied","Data":"72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6"} Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.249454 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.262152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567120-j7789" event={"ID":"bed1d31b-b060-45c3-95bf-3b226a36efe1","Type":"ContainerDied","Data":"dd9d5f9731ec60032210cdc180eb41d5f236e29f6e6729daa332365615c09023"} Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.262184 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd9d5f9731ec60032210cdc180eb41d5f236e29f6e6729daa332365615c09023" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.262223 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.262853 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x4492" podStartSLOduration=6.918039605 podStartE2EDuration="29.262836416s" podCreationTimestamp="2026-03-20 17:21:23 +0000 UTC" firstStartedPulling="2026-03-20 17:21:29.403385631 +0000 UTC m=+232.861417172" lastFinishedPulling="2026-03-20 17:21:51.748182442 +0000 UTC m=+255.206213983" observedRunningTime="2026-03-20 17:21:52.260931898 +0000 UTC m=+255.718963439" watchObservedRunningTime="2026-03-20 17:21:52.262836416 +0000 UTC m=+255.720867957" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.266577 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerStarted","Data":"ccf80d1c25b4b2ca8dbba9a5886768a507ee7b4accfd07c29683288376099b10"} Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.271364 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cdc3472c-58bf-4b57-aa00-34677fc42e06","Type":"ContainerDied","Data":"cdf769cace6742ee2c32cad7bdf9fa8b4a0644fb379f47fb111fd1d463a2ab13"} Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.271390 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.271392 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdf769cace6742ee2c32cad7bdf9fa8b4a0644fb379f47fb111fd1d463a2ab13" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.289258 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l6vnf" podStartSLOduration=2.56873014 podStartE2EDuration="31.289234067s" podCreationTimestamp="2026-03-20 17:21:21 +0000 UTC" firstStartedPulling="2026-03-20 17:21:22.9656045 +0000 UTC m=+226.423636051" lastFinishedPulling="2026-03-20 17:21:51.686108437 +0000 UTC m=+255.144139978" observedRunningTime="2026-03-20 17:21:52.279489036 +0000 UTC m=+255.737520567" watchObservedRunningTime="2026-03-20 17:21:52.289234067 +0000 UTC m=+255.747265608" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.463199 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:21:52 crc kubenswrapper[4795]: W0320 17:21:52.474446 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb9668af_0fcd_484b_a4dd_929c06088636.slice/crio-9e0445907ee72ebdc97c6462b2a382e515c86153ff2b1c9bb2b03cde9237a5da WatchSource:0}: Error finding container 9e0445907ee72ebdc97c6462b2a382e515c86153ff2b1c9bb2b03cde9237a5da: Status 404 returned error can't find the container with id 9e0445907ee72ebdc97c6462b2a382e515c86153ff2b1c9bb2b03cde9237a5da Mar 20 17:21:53 crc kubenswrapper[4795]: I0320 17:21:53.282146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" 
event={"ID":"cb9668af-0fcd-484b-a4dd-929c06088636","Type":"ContainerStarted","Data":"75085cc0e6ae210bc7cdac642ecf687ee44273af1075dbe71a7eac4507b22825"} Mar 20 17:21:53 crc kubenswrapper[4795]: I0320 17:21:53.282428 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" event={"ID":"cb9668af-0fcd-484b-a4dd-929c06088636","Type":"ContainerStarted","Data":"9e0445907ee72ebdc97c6462b2a382e515c86153ff2b1c9bb2b03cde9237a5da"} Mar 20 17:21:53 crc kubenswrapper[4795]: I0320 17:21:53.296265 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" podStartSLOduration=17.296234513 podStartE2EDuration="17.296234513s" podCreationTimestamp="2026-03-20 17:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:53.295168914 +0000 UTC m=+256.753200455" watchObservedRunningTime="2026-03-20 17:21:53.296234513 +0000 UTC m=+256.754266054" Mar 20 17:21:53 crc kubenswrapper[4795]: I0320 17:21:53.525566 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:53 crc kubenswrapper[4795]: I0320 17:21:53.525984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:54 crc kubenswrapper[4795]: I0320 17:21:54.287350 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:54 crc kubenswrapper[4795]: I0320 17:21:54.292214 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:54 crc kubenswrapper[4795]: I0320 17:21:54.685370 4795 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-x4492" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="registry-server" probeResult="failure" output=< Mar 20 17:21:54 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:21:54 crc kubenswrapper[4795]: > Mar 20 17:21:54 crc kubenswrapper[4795]: I0320 17:21:54.693182 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:56 crc kubenswrapper[4795]: I0320 17:21:56.461379 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:21:56 crc kubenswrapper[4795]: I0320 17:21:56.544904 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"] Mar 20 17:21:56 crc kubenswrapper[4795]: I0320 17:21:56.545487 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" podUID="af030096-8488-42df-be2c-a39b58ff0612" containerName="route-controller-manager" containerID="cri-o://de73514b8f320ab741cc124ca96503c98c90527d123632a693748372593ae1e0" gracePeriod=30 Mar 20 17:21:57 crc kubenswrapper[4795]: I0320 17:21:57.307336 4795 generic.go:334] "Generic (PLEG): container finished" podID="af030096-8488-42df-be2c-a39b58ff0612" containerID="de73514b8f320ab741cc124ca96503c98c90527d123632a693748372593ae1e0" exitCode=0 Mar 20 17:21:57 crc kubenswrapper[4795]: I0320 17:21:57.307429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" event={"ID":"af030096-8488-42df-be2c-a39b58ff0612","Type":"ContainerDied","Data":"de73514b8f320ab741cc124ca96503c98c90527d123632a693748372593ae1e0"} Mar 20 17:21:57 crc kubenswrapper[4795]: I0320 17:21:57.307915 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" podUID="cb9668af-0fcd-484b-a4dd-929c06088636" containerName="controller-manager" containerID="cri-o://75085cc0e6ae210bc7cdac642ecf687ee44273af1075dbe71a7eac4507b22825" gracePeriod=30 Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.314289 4795 generic.go:334] "Generic (PLEG): container finished" podID="cb9668af-0fcd-484b-a4dd-929c06088636" containerID="75085cc0e6ae210bc7cdac642ecf687ee44273af1075dbe71a7eac4507b22825" exitCode=0 Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.314339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" event={"ID":"cb9668af-0fcd-484b-a4dd-929c06088636","Type":"ContainerDied","Data":"75085cc0e6ae210bc7cdac642ecf687ee44273af1075dbe71a7eac4507b22825"} Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.511055 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.511921 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.513406 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.513893 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.523313 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.673824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.673930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.775442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.775521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.775740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.795223 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.843727 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:59 crc kubenswrapper[4795]: I0320 17:21:59.305502 4795 patch_prober.go:28] interesting pod/route-controller-manager-b7bc44c6c-lndxp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Mar 20 17:21:59 crc kubenswrapper[4795]: I0320 17:21:59.305580 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" podUID="af030096-8488-42df-be2c-a39b58ff0612" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.127797 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567122-fns4l"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.131109 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-fns4l"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.131199 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.134351 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.134421 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.134621 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.295434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjr2\" (UniqueName: \"kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2\") pod \"auto-csr-approver-29567122-fns4l\" (UID: \"a0486c12-c384-46ff-925b-bfeefb1d59bb\") " pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.296016 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.324239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" event={"ID":"af030096-8488-42df-be2c-a39b58ff0612","Type":"ContainerDied","Data":"d97024fb9f1af8ccac26b9f9b512b0db9ba1fd92cda4860946d1b691addcb115"} Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.324293 4795 scope.go:117] "RemoveContainer" containerID="de73514b8f320ab741cc124ca96503c98c90527d123632a693748372593ae1e0" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.324402 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.327097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerStarted","Data":"ccb42fb76e422903cb8ae67b0af284fad8c19cf7bcd45ce5c5bd37e094afd21b"} Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.327463 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:00 crc kubenswrapper[4795]: E0320 17:22:00.327737 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af030096-8488-42df-be2c-a39b58ff0612" containerName="route-controller-manager" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.327750 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="af030096-8488-42df-be2c-a39b58ff0612" containerName="route-controller-manager" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.327852 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="af030096-8488-42df-be2c-a39b58ff0612" containerName="route-controller-manager" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.328190 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.337358 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.343534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerStarted","Data":"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa"} Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.368897 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.396055 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca\") pod \"af030096-8488-42df-be2c-a39b58ff0612\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.396141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert\") pod \"af030096-8488-42df-be2c-a39b58ff0612\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.396165 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config\") pod \"af030096-8488-42df-be2c-a39b58ff0612\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.396229 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-69x8r\" (UniqueName: \"kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r\") pod \"af030096-8488-42df-be2c-a39b58ff0612\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.396367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjr2\" (UniqueName: \"kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2\") pod \"auto-csr-approver-29567122-fns4l\" (UID: \"a0486c12-c384-46ff-925b-bfeefb1d59bb\") " pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.397515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca" (OuterVolumeSpecName: "client-ca") pod "af030096-8488-42df-be2c-a39b58ff0612" (UID: "af030096-8488-42df-be2c-a39b58ff0612"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.397570 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config" (OuterVolumeSpecName: "config") pod "af030096-8488-42df-be2c-a39b58ff0612" (UID: "af030096-8488-42df-be2c-a39b58ff0612"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.405657 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r" (OuterVolumeSpecName: "kube-api-access-69x8r") pod "af030096-8488-42df-be2c-a39b58ff0612" (UID: "af030096-8488-42df-be2c-a39b58ff0612"). InnerVolumeSpecName "kube-api-access-69x8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.415059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjr2\" (UniqueName: \"kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2\") pod \"auto-csr-approver-29567122-fns4l\" (UID: \"a0486c12-c384-46ff-925b-bfeefb1d59bb\") " pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.419956 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af030096-8488-42df-be2c-a39b58ff0612" (UID: "af030096-8488-42df-be2c-a39b58ff0612"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.464112 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.489916 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.498981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert\") pod \"cb9668af-0fcd-484b-a4dd-929c06088636\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles\") pod \"cb9668af-0fcd-484b-a4dd-929c06088636\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca\") pod \"cb9668af-0fcd-484b-a4dd-929c06088636\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499283 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config\") pod \"cb9668af-0fcd-484b-a4dd-929c06088636\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499314 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57hzx\" (UniqueName: \"kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx\") pod \"cb9668af-0fcd-484b-a4dd-929c06088636\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499505 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdfls\" (UniqueName: \"kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499580 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499624 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69x8r\" (UniqueName: \"kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499634 
4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499643 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499651 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.500956 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cb9668af-0fcd-484b-a4dd-929c06088636" (UID: "cb9668af-0fcd-484b-a4dd-929c06088636"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.501306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca" (OuterVolumeSpecName: "client-ca") pod "cb9668af-0fcd-484b-a4dd-929c06088636" (UID: "cb9668af-0fcd-484b-a4dd-929c06088636"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.501759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config" (OuterVolumeSpecName: "config") pod "cb9668af-0fcd-484b-a4dd-929c06088636" (UID: "cb9668af-0fcd-484b-a4dd-929c06088636"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.509846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx" (OuterVolumeSpecName: "kube-api-access-57hzx") pod "cb9668af-0fcd-484b-a4dd-929c06088636" (UID: "cb9668af-0fcd-484b-a4dd-929c06088636"). InnerVolumeSpecName "kube-api-access-57hzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.509948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cb9668af-0fcd-484b-a4dd-929c06088636" (UID: "cb9668af-0fcd-484b-a4dd-929c06088636"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.602747 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.602818 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdfls\" (UniqueName: \"kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.602857 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.602900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.603662 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.603700 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.603717 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.603728 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.603744 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57hzx\" (UniqueName: \"kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.622730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.623285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.623594 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.626478 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdfls\" (UniqueName: \"kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.645061 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.666709 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.674827 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.824587 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:00 crc kubenswrapper[4795]: W0320 17:22:00.831157 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26689f8_7057_45ba_8d53_ae4623ecd2e9.slice/crio-48227db9a6937ae60b0746c4afb05ece27c6be8da72af95842f2494d449cc21b WatchSource:0}: Error finding container 48227db9a6937ae60b0746c4afb05ece27c6be8da72af95842f2494d449cc21b: Status 404 returned error can't find the container with id 48227db9a6937ae60b0746c4afb05ece27c6be8da72af95842f2494d449cc21b Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.901201 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-fns4l"] Mar 20 17:22:00 crc kubenswrapper[4795]: W0320 17:22:00.905736 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0486c12_c384_46ff_925b_bfeefb1d59bb.slice/crio-265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a WatchSource:0}: Error finding container 265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a: Status 404 returned error can't find the container with id 265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a Mar 20 17:22:01 crc 
kubenswrapper[4795]: I0320 17:22:01.269302 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af030096-8488-42df-be2c-a39b58ff0612" path="/var/lib/kubelet/pods/af030096-8488-42df-be2c-a39b58ff0612/volumes" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.351586 4795 generic.go:334] "Generic (PLEG): container finished" podID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerID="ccb42fb76e422903cb8ae67b0af284fad8c19cf7bcd45ce5c5bd37e094afd21b" exitCode=0 Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.351783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerDied","Data":"ccb42fb76e422903cb8ae67b0af284fad8c19cf7bcd45ce5c5bd37e094afd21b"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.353958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" event={"ID":"cb9668af-0fcd-484b-a4dd-929c06088636","Type":"ContainerDied","Data":"9e0445907ee72ebdc97c6462b2a382e515c86153ff2b1c9bb2b03cde9237a5da"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.354042 4795 scope.go:117] "RemoveContainer" containerID="75085cc0e6ae210bc7cdac642ecf687ee44273af1075dbe71a7eac4507b22825" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.353970 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.355852 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567122-fns4l" event={"ID":"a0486c12-c384-46ff-925b-bfeefb1d59bb","Type":"ContainerStarted","Data":"265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.358632 4795 generic.go:334] "Generic (PLEG): container finished" podID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerID="6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa" exitCode=0 Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.358746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerDied","Data":"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.362804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"324189bb-8d17-4759-8902-0e960316a64b","Type":"ContainerStarted","Data":"1165a4e47d8fe5ba2e9d8af8b5728fb3bb7d991766808000e075dbc69758730b"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.362849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"324189bb-8d17-4759-8902-0e960316a64b","Type":"ContainerStarted","Data":"c56b30507509c740a94c11008112ae36e1c890764e38285ed7897648547f8686"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.365053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" event={"ID":"d26689f8-7057-45ba-8d53-ae4623ecd2e9","Type":"ContainerStarted","Data":"48227db9a6937ae60b0746c4afb05ece27c6be8da72af95842f2494d449cc21b"} Mar 
20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.401645 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=3.401625939 podStartE2EDuration="3.401625939s" podCreationTimestamp="2026-03-20 17:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:01.398618575 +0000 UTC m=+264.856650156" watchObservedRunningTime="2026-03-20 17:22:01.401625939 +0000 UTC m=+264.859657490" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.425185 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.435879 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.744317 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.744352 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.809156 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.374714 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" event={"ID":"d26689f8-7057-45ba-8d53-ae4623ecd2e9","Type":"ContainerStarted","Data":"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f"} Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.375301 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.408923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerStarted","Data":"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741"} Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.419202 4795 generic.go:334] "Generic (PLEG): container finished" podID="324189bb-8d17-4759-8902-0e960316a64b" containerID="1165a4e47d8fe5ba2e9d8af8b5728fb3bb7d991766808000e075dbc69758730b" exitCode=0 Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.419319 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"324189bb-8d17-4759-8902-0e960316a64b","Type":"ContainerDied","Data":"1165a4e47d8fe5ba2e9d8af8b5728fb3bb7d991766808000e075dbc69758730b"} Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.427174 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" podStartSLOduration=6.427158685 podStartE2EDuration="6.427158685s" podCreationTimestamp="2026-03-20 17:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:02.398065852 +0000 UTC m=+265.856097493" watchObservedRunningTime="2026-03-20 17:22:02.427158685 +0000 UTC m=+265.885190236" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.460112 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.876718 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.923662 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:02 crc kubenswrapper[4795]: E0320 17:22:02.923892 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9668af-0fcd-484b-a4dd-929c06088636" containerName="controller-manager" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.923905 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9668af-0fcd-484b-a4dd-929c06088636" containerName="controller-manager" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.924001 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9668af-0fcd-484b-a4dd-929c06088636" containerName="controller-manager" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.924316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.937228 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.937734 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.937860 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.938795 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.941231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.941819 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.942087 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.944526 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.044401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.044651 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.044782 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.044871 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljdh\" (UniqueName: \"kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.044982 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.146506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.146565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.146590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " 
pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.146614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tljdh\" (UniqueName: \"kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.146651 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.147837 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.147947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.148277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config\") pod 
\"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.154469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.162540 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tljdh\" (UniqueName: \"kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.251677 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.259068 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb9668af-0fcd-484b-a4dd-929c06088636" path="/var/lib/kubelet/pods/cb9668af-0fcd-484b-a4dd-929c06088636/volumes" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.308584 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.309279 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.322345 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.426802 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerStarted","Data":"c35ab86ff89c55002aa81e7c712981fc2a1f900a50a13c618ac66b790346ea8a"} Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.428817 4795 generic.go:334] "Generic (PLEG): container finished" podID="a0486c12-c384-46ff-925b-bfeefb1d59bb" containerID="14e15a12796f646063cb5f653e99e6ad23f1724726dfb97b08e9621c085665c1" exitCode=0 Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.428918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567122-fns4l" event={"ID":"a0486c12-c384-46ff-925b-bfeefb1d59bb","Type":"ContainerDied","Data":"14e15a12796f646063cb5f653e99e6ad23f1724726dfb97b08e9621c085665c1"} Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.431894 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerID="383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c" exitCode=0 Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.431940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerDied","Data":"383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c"} Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.448696 4795 generic.go:334] "Generic (PLEG): container finished" podID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerID="824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741" exitCode=0 Mar 20 17:22:03 crc 
kubenswrapper[4795]: I0320 17:22:03.448776 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerDied","Data":"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741"} Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.450195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.450282 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.450315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.453892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerStarted","Data":"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c"} Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.466659 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2m5ld" 
podStartSLOduration=27.323917933 podStartE2EDuration="39.466639696s" podCreationTimestamp="2026-03-20 17:21:24 +0000 UTC" firstStartedPulling="2026-03-20 17:21:50.206551043 +0000 UTC m=+253.664582584" lastFinishedPulling="2026-03-20 17:22:02.349272806 +0000 UTC m=+265.807304347" observedRunningTime="2026-03-20 17:22:03.44970783 +0000 UTC m=+266.907739371" watchObservedRunningTime="2026-03-20 17:22:03.466639696 +0000 UTC m=+266.924671237" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.525096 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q7czt" podStartSLOduration=28.374026407 podStartE2EDuration="40.525074661s" podCreationTimestamp="2026-03-20 17:21:23 +0000 UTC" firstStartedPulling="2026-03-20 17:21:50.182110852 +0000 UTC m=+253.640142383" lastFinishedPulling="2026-03-20 17:22:02.333159096 +0000 UTC m=+265.791190637" observedRunningTime="2026-03-20 17:22:03.521043676 +0000 UTC m=+266.979075227" watchObservedRunningTime="2026-03-20 17:22:03.525074661 +0000 UTC m=+266.983106212" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.551523 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.551789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.551979 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.552095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.552100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.564877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.571736 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.619222 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.667800 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.673333 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.684243 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:03 crc kubenswrapper[4795]: W0320 17:22:03.687773 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85b922ea_7281_44f5_b78b_b0ec5d5387d3.slice/crio-7b73ca2a9ce585e0f34c21ad9d5d09f64d1d3d058ddcee892f2d60850aae971b WatchSource:0}: Error finding container 7b73ca2a9ce585e0f34c21ad9d5d09f64d1d3d058ddcee892f2d60850aae971b: Status 404 returned error can't find the container with id 7b73ca2a9ce585e0f34c21ad9d5d09f64d1d3d058ddcee892f2d60850aae971b Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.758936 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir\") pod \"324189bb-8d17-4759-8902-0e960316a64b\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.759268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access\") pod \"324189bb-8d17-4759-8902-0e960316a64b\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.759068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "324189bb-8d17-4759-8902-0e960316a64b" (UID: 
"324189bb-8d17-4759-8902-0e960316a64b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.766436 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "324189bb-8d17-4759-8902-0e960316a64b" (UID: "324189bb-8d17-4759-8902-0e960316a64b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.861858 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.861889 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.139782 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:22:04 crc kubenswrapper[4795]: W0320 17:22:04.181916 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod49054187_cb30_4f07_b67a_794c2503f50a.slice/crio-bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa WatchSource:0}: Error finding container bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa: Status 404 returned error can't find the container with id bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.323878 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:22:04 crc 
kubenswrapper[4795]: I0320 17:22:04.323928 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.461348 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"324189bb-8d17-4759-8902-0e960316a64b","Type":"ContainerDied","Data":"c56b30507509c740a94c11008112ae36e1c890764e38285ed7897648547f8686"} Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.461388 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56b30507509c740a94c11008112ae36e1c890764e38285ed7897648547f8686" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.462412 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.463291 4795 generic.go:334] "Generic (PLEG): container finished" podID="70000016-e928-4b11-a31d-4d08e9450a1c" containerID="6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e" exitCode=0 Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.463343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerDied","Data":"6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e"} Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.465111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" event={"ID":"85b922ea-7281-44f5-b78b-b0ec5d5387d3","Type":"ContainerStarted","Data":"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b"} Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.465153 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" event={"ID":"85b922ea-7281-44f5-b78b-b0ec5d5387d3","Type":"ContainerStarted","Data":"7b73ca2a9ce585e0f34c21ad9d5d09f64d1d3d058ddcee892f2d60850aae971b"} Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.465497 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.466528 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"49054187-cb30-4f07-b67a-794c2503f50a","Type":"ContainerStarted","Data":"bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa"} Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.468167 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerStarted","Data":"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a"} Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.469753 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.470961 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerStarted","Data":"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1"} Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.491919 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" podStartSLOduration=8.491904795 podStartE2EDuration="8.491904795s" podCreationTimestamp="2026-03-20 17:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:04.490265394 +0000 UTC m=+267.948296935" watchObservedRunningTime="2026-03-20 17:22:04.491904795 +0000 UTC m=+267.949936346" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.517655 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kzvch" podStartSLOduration=2.295216382 podStartE2EDuration="44.517642095s" podCreationTimestamp="2026-03-20 17:21:20 +0000 UTC" firstStartedPulling="2026-03-20 17:21:21.938448969 +0000 UTC m=+225.396480510" lastFinishedPulling="2026-03-20 17:22:04.160874682 +0000 UTC m=+267.618906223" observedRunningTime="2026-03-20 17:22:04.514996542 +0000 UTC m=+267.973028083" watchObservedRunningTime="2026-03-20 17:22:04.517642095 +0000 UTC m=+267.975673636" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.542619 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hv8kd" podStartSLOduration=2.648201927 podStartE2EDuration="43.5426005s" podCreationTimestamp="2026-03-20 17:21:21 +0000 UTC" firstStartedPulling="2026-03-20 17:21:22.971895947 +0000 UTC m=+226.429927488" lastFinishedPulling="2026-03-20 17:22:03.86629452 +0000 UTC m=+267.324326061" observedRunningTime="2026-03-20 17:22:04.542330231 +0000 UTC m=+268.000361772" watchObservedRunningTime="2026-03-20 17:22:04.5426005 +0000 UTC m=+268.000632031" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.724041 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.749667 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.749733 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.882021 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjr2\" (UniqueName: \"kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2\") pod \"a0486c12-c384-46ff-925b-bfeefb1d59bb\" (UID: \"a0486c12-c384-46ff-925b-bfeefb1d59bb\") " Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.887965 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2" (OuterVolumeSpecName: "kube-api-access-tcjr2") pod "a0486c12-c384-46ff-925b-bfeefb1d59bb" (UID: "a0486c12-c384-46ff-925b-bfeefb1d59bb"). InnerVolumeSpecName "kube-api-access-tcjr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.983841 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcjr2\" (UniqueName: \"kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.362641 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q7czt" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="registry-server" probeResult="failure" output=< Mar 20 17:22:05 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:22:05 crc kubenswrapper[4795]: > Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.408506 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"] Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.408911 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l6vnf" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="registry-server" containerID="cri-o://ccf80d1c25b4b2ca8dbba9a5886768a507ee7b4accfd07c29683288376099b10" gracePeriod=2 Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.477378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567122-fns4l" event={"ID":"a0486c12-c384-46ff-925b-bfeefb1d59bb","Type":"ContainerDied","Data":"265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a"} Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.477424 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a" Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.477488 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.478918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"49054187-cb30-4f07-b67a-794c2503f50a","Type":"ContainerStarted","Data":"30207c849aa57355d0d1027a4dded499a63ab8fad8a6d8162ced0752bd75a382"} Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.482981 4795 generic.go:334] "Generic (PLEG): container finished" podID="57849322-f280-42ee-a330-18120aeed5db" containerID="36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f" exitCode=0 Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.483020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerDied","Data":"36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f"} Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.486871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerStarted","Data":"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d"} Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.498554 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.498530774 podStartE2EDuration="2.498530774s" podCreationTimestamp="2026-03-20 17:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:05.49740498 +0000 UTC m=+268.955436521" watchObservedRunningTime="2026-03-20 17:22:05.498530774 +0000 UTC m=+268.956562315" Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.519834 4795 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-marketplace-ht4zv" podStartSLOduration=2.705297964 podStartE2EDuration="43.519816076s" podCreationTimestamp="2026-03-20 17:21:22 +0000 UTC" firstStartedPulling="2026-03-20 17:21:24.059380699 +0000 UTC m=+227.517412240" lastFinishedPulling="2026-03-20 17:22:04.873898811 +0000 UTC m=+268.331930352" observedRunningTime="2026-03-20 17:22:05.517457423 +0000 UTC m=+268.975488964" watchObservedRunningTime="2026-03-20 17:22:05.519816076 +0000 UTC m=+268.977847617" Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.787948 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2m5ld" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="registry-server" probeResult="failure" output=< Mar 20 17:22:05 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:22:05 crc kubenswrapper[4795]: > Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.493356 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerID="ccf80d1c25b4b2ca8dbba9a5886768a507ee7b4accfd07c29683288376099b10" exitCode=0 Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.494279 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerDied","Data":"ccf80d1c25b4b2ca8dbba9a5886768a507ee7b4accfd07c29683288376099b10"} Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.689320 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.806534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content\") pod \"9b3b1055-857d-4334-b39a-24b0ac9139d1\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.806594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vks54\" (UniqueName: \"kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54\") pod \"9b3b1055-857d-4334-b39a-24b0ac9139d1\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.806674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities\") pod \"9b3b1055-857d-4334-b39a-24b0ac9139d1\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.807475 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities" (OuterVolumeSpecName: "utilities") pod "9b3b1055-857d-4334-b39a-24b0ac9139d1" (UID: "9b3b1055-857d-4334-b39a-24b0ac9139d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.811940 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54" (OuterVolumeSpecName: "kube-api-access-vks54") pod "9b3b1055-857d-4334-b39a-24b0ac9139d1" (UID: "9b3b1055-857d-4334-b39a-24b0ac9139d1"). InnerVolumeSpecName "kube-api-access-vks54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.857480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b3b1055-857d-4334-b39a-24b0ac9139d1" (UID: "9b3b1055-857d-4334-b39a-24b0ac9139d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.907645 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.907692 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.907704 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vks54\" (UniqueName: \"kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.502080 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.502087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerDied","Data":"1491707e1e9a76fca81ebcac286e8ae28085443c9768ed4379919b2d9e23dca3"} Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.502246 4795 scope.go:117] "RemoveContainer" containerID="ccf80d1c25b4b2ca8dbba9a5886768a507ee7b4accfd07c29683288376099b10" Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.504625 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerStarted","Data":"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233"} Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.522191 4795 scope.go:117] "RemoveContainer" containerID="7ec8b989964af7b47bbfe9e2ba650db31b2b4a3194973fb0444b0a3b977d10bc" Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.529626 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kk5rk" podStartSLOduration=3.468618398 podStartE2EDuration="47.529602088s" podCreationTimestamp="2026-03-20 17:21:20 +0000 UTC" firstStartedPulling="2026-03-20 17:21:22.97421773 +0000 UTC m=+226.432249271" lastFinishedPulling="2026-03-20 17:22:07.03520142 +0000 UTC m=+270.493232961" observedRunningTime="2026-03-20 17:22:07.524917052 +0000 UTC m=+270.982948593" watchObservedRunningTime="2026-03-20 17:22:07.529602088 +0000 UTC m=+270.987633629" Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.540508 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"] Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.541475 4795 scope.go:117] "RemoveContainer" 
containerID="82d8d3d7e1e3eb80a041eea63e969e0d1aa9af7af1cccc7c6c9f3460b4809935" Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.547068 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"] Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.609110 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"] Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.609378 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x4492" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="registry-server" containerID="cri-o://89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784" gracePeriod=2 Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.121128 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.269988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities\") pod \"366eee86-1ca2-4662-b32d-c00d4c1d513f\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.270082 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content\") pod \"366eee86-1ca2-4662-b32d-c00d4c1d513f\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.270104 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js4cp\" (UniqueName: \"kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp\") pod 
\"366eee86-1ca2-4662-b32d-c00d4c1d513f\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.271011 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities" (OuterVolumeSpecName: "utilities") pod "366eee86-1ca2-4662-b32d-c00d4c1d513f" (UID: "366eee86-1ca2-4662-b32d-c00d4c1d513f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.276779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp" (OuterVolumeSpecName: "kube-api-access-js4cp") pod "366eee86-1ca2-4662-b32d-c00d4c1d513f" (UID: "366eee86-1ca2-4662-b32d-c00d4c1d513f"). InnerVolumeSpecName "kube-api-access-js4cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.301954 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "366eee86-1ca2-4662-b32d-c00d4c1d513f" (UID: "366eee86-1ca2-4662-b32d-c00d4c1d513f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.371189 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.371221 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.371231 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js4cp\" (UniqueName: \"kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.511059 4795 generic.go:334] "Generic (PLEG): container finished" podID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerID="89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784" exitCode=0 Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.511103 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.511278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerDied","Data":"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784"} Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.511326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerDied","Data":"0bf34893ceb2a123dbae4a13fdf9053d4d9c1472bbfe52b966a8795f5fc54346"} Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.511344 4795 scope.go:117] "RemoveContainer" containerID="89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.525718 4795 scope.go:117] "RemoveContainer" containerID="f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.537073 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"] Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.543800 4795 scope.go:117] "RemoveContainer" containerID="356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.545680 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"] Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.556608 4795 scope.go:117] "RemoveContainer" containerID="89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784" Mar 20 17:22:08 crc kubenswrapper[4795]: E0320 17:22:08.557014 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784\": container with ID starting with 89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784 not found: ID does not exist" containerID="89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.557043 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784"} err="failed to get container status \"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784\": rpc error: code = NotFound desc = could not find container \"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784\": container with ID starting with 89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784 not found: ID does not exist" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.557064 4795 scope.go:117] "RemoveContainer" containerID="f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d" Mar 20 17:22:08 crc kubenswrapper[4795]: E0320 17:22:08.557366 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d\": container with ID starting with f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d not found: ID does not exist" containerID="f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.557428 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d"} err="failed to get container status \"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d\": rpc error: code = NotFound desc = could not find container \"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d\": container with ID 
starting with f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d not found: ID does not exist" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.557456 4795 scope.go:117] "RemoveContainer" containerID="356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf" Mar 20 17:22:08 crc kubenswrapper[4795]: E0320 17:22:08.557841 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf\": container with ID starting with 356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf not found: ID does not exist" containerID="356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf" Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.557893 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf"} err="failed to get container status \"356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf\": rpc error: code = NotFound desc = could not find container \"356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf\": container with ID starting with 356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf not found: ID does not exist" Mar 20 17:22:09 crc kubenswrapper[4795]: I0320 17:22:09.258725 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" path="/var/lib/kubelet/pods/366eee86-1ca2-4662-b32d-c00d4c1d513f/volumes" Mar 20 17:22:09 crc kubenswrapper[4795]: I0320 17:22:09.259674 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" path="/var/lib/kubelet/pods/9b3b1055-857d-4334-b39a-24b0ac9139d1/volumes" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.105278 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.105331 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.171189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.300708 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.300769 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.300817 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.301209 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.301271 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506" gracePeriod=600 Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.303623 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.304019 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.357175 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.529945 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.530593 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.571237 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.580372 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:22:12 crc kubenswrapper[4795]: I0320 17:22:12.551940 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506" exitCode=0 Mar 20 17:22:12 crc kubenswrapper[4795]: I0320 17:22:12.552018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506"} Mar 20 17:22:12 crc kubenswrapper[4795]: I0320 17:22:12.612076 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:22:13 crc kubenswrapper[4795]: I0320 17:22:13.135399 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:22:13 crc kubenswrapper[4795]: I0320 17:22:13.135764 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:22:13 crc kubenswrapper[4795]: I0320 17:22:13.191321 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:22:13 crc kubenswrapper[4795]: I0320 17:22:13.558828 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b"} Mar 20 17:22:13 crc kubenswrapper[4795]: I0320 17:22:13.608638 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:22:14 crc kubenswrapper[4795]: I0320 17:22:14.013508 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:22:14 crc kubenswrapper[4795]: I0320 17:22:14.368536 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:22:14 crc kubenswrapper[4795]: I0320 17:22:14.468073 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:22:14 crc kubenswrapper[4795]: I0320 17:22:14.801566 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:22:14 crc kubenswrapper[4795]: I0320 17:22:14.852001 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:22:15 crc kubenswrapper[4795]: I0320 17:22:15.565865 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hv8kd" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="registry-server" containerID="cri-o://a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a" gracePeriod=2 Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.060049 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.103899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content\") pod \"7b4d98b5-0434-4a84-b890-d2428de998b7\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.104000 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities\") pod \"7b4d98b5-0434-4a84-b890-d2428de998b7\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.104144 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qsng\" (UniqueName: \"kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng\") pod 
\"7b4d98b5-0434-4a84-b890-d2428de998b7\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.108837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities" (OuterVolumeSpecName: "utilities") pod "7b4d98b5-0434-4a84-b890-d2428de998b7" (UID: "7b4d98b5-0434-4a84-b890-d2428de998b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.109837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng" (OuterVolumeSpecName: "kube-api-access-2qsng") pod "7b4d98b5-0434-4a84-b890-d2428de998b7" (UID: "7b4d98b5-0434-4a84-b890-d2428de998b7"). InnerVolumeSpecName "kube-api-access-2qsng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.155056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b4d98b5-0434-4a84-b890-d2428de998b7" (UID: "7b4d98b5-0434-4a84-b890-d2428de998b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.205893 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.205929 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.205939 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qsng\" (UniqueName: \"kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.463877 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.464459 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" podUID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" containerName="controller-manager" containerID="cri-o://9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b" gracePeriod=30 Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.481123 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.481411 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" podUID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" containerName="route-controller-manager" 
containerID="cri-o://46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f" gracePeriod=30 Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.572637 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerID="a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a" exitCode=0 Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.572704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerDied","Data":"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a"} Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.572741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerDied","Data":"2e4b95450b8315d24d755f1e11d49c142fbb519e0e3cc346ea05ef39f9bff4b9"} Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.572741 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.572764 4795 scope.go:117] "RemoveContainer" containerID="a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.697348 4795 scope.go:117] "RemoveContainer" containerID="383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.723771 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.726913 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.741811 4795 scope.go:117] "RemoveContainer" containerID="051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.753407 4795 scope.go:117] "RemoveContainer" containerID="a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a" Mar 20 17:22:16 crc kubenswrapper[4795]: E0320 17:22:16.753749 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a\": container with ID starting with a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a not found: ID does not exist" containerID="a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.753788 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a"} err="failed to get container status \"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a\": rpc error: code = NotFound desc = could not find 
container \"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a\": container with ID starting with a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a not found: ID does not exist" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.753815 4795 scope.go:117] "RemoveContainer" containerID="383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c" Mar 20 17:22:16 crc kubenswrapper[4795]: E0320 17:22:16.754164 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c\": container with ID starting with 383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c not found: ID does not exist" containerID="383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.754191 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c"} err="failed to get container status \"383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c\": rpc error: code = NotFound desc = could not find container \"383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c\": container with ID starting with 383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c not found: ID does not exist" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.754207 4795 scope.go:117] "RemoveContainer" containerID="051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6" Mar 20 17:22:16 crc kubenswrapper[4795]: E0320 17:22:16.754475 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6\": container with ID starting with 051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6 not found: ID does 
not exist" containerID="051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.754492 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6"} err="failed to get container status \"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6\": rpc error: code = NotFound desc = could not find container \"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6\": container with ID starting with 051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6 not found: ID does not exist" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.023830 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.055783 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.118774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca\") pod \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.118850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tljdh\" (UniqueName: \"kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh\") pod \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.118918 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdfls\" (UniqueName: \"kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls\") pod \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.118963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca\") pod \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles\") pod \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119080 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config\") pod \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert\") pod \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119167 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert\") pod \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119202 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config\") pod \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119705 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "85b922ea-7281-44f5-b78b-b0ec5d5387d3" (UID: "85b922ea-7281-44f5-b78b-b0ec5d5387d3"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119887 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca" (OuterVolumeSpecName: "client-ca") pod "85b922ea-7281-44f5-b78b-b0ec5d5387d3" (UID: "85b922ea-7281-44f5-b78b-b0ec5d5387d3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119901 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config" (OuterVolumeSpecName: "config") pod "d26689f8-7057-45ba-8d53-ae4623ecd2e9" (UID: "d26689f8-7057-45ba-8d53-ae4623ecd2e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119961 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "d26689f8-7057-45ba-8d53-ae4623ecd2e9" (UID: "d26689f8-7057-45ba-8d53-ae4623ecd2e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.120041 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config" (OuterVolumeSpecName: "config") pod "85b922ea-7281-44f5-b78b-b0ec5d5387d3" (UID: "85b922ea-7281-44f5-b78b-b0ec5d5387d3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.123410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d26689f8-7057-45ba-8d53-ae4623ecd2e9" (UID: "d26689f8-7057-45ba-8d53-ae4623ecd2e9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.123416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "85b922ea-7281-44f5-b78b-b0ec5d5387d3" (UID: "85b922ea-7281-44f5-b78b-b0ec5d5387d3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.123515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh" (OuterVolumeSpecName: "kube-api-access-tljdh") pod "85b922ea-7281-44f5-b78b-b0ec5d5387d3" (UID: "85b922ea-7281-44f5-b78b-b0ec5d5387d3"). InnerVolumeSpecName "kube-api-access-tljdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.127807 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls" (OuterVolumeSpecName: "kube-api-access-bdfls") pod "d26689f8-7057-45ba-8d53-ae4623ecd2e9" (UID: "d26689f8-7057-45ba-8d53-ae4623ecd2e9"). InnerVolumeSpecName "kube-api-access-bdfls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220829 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220865 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tljdh\" (UniqueName: \"kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220876 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdfls\" (UniqueName: \"kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220885 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220893 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220902 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220909 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220917 4795 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220926 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.262001 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" path="/var/lib/kubelet/pods/7b4d98b5-0434-4a84-b890-d2428de998b7/volumes" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.580175 4795 generic.go:334] "Generic (PLEG): container finished" podID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" containerID="46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f" exitCode=0 Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.580286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" event={"ID":"d26689f8-7057-45ba-8d53-ae4623ecd2e9","Type":"ContainerDied","Data":"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f"} Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.580327 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.580625 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" event={"ID":"d26689f8-7057-45ba-8d53-ae4623ecd2e9","Type":"ContainerDied","Data":"48227db9a6937ae60b0746c4afb05ece27c6be8da72af95842f2494d449cc21b"} Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.580670 4795 scope.go:117] "RemoveContainer" containerID="46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.585738 4795 generic.go:334] "Generic (PLEG): container finished" podID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" containerID="9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b" exitCode=0 Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.585812 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.585876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" event={"ID":"85b922ea-7281-44f5-b78b-b0ec5d5387d3","Type":"ContainerDied","Data":"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b"} Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.585929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" event={"ID":"85b922ea-7281-44f5-b78b-b0ec5d5387d3","Type":"ContainerDied","Data":"7b73ca2a9ce585e0f34c21ad9d5d09f64d1d3d058ddcee892f2d60850aae971b"} Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.609822 4795 scope.go:117] "RemoveContainer" containerID="46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.611312 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f\": container with ID starting with 46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f not found: ID does not exist" containerID="46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.611386 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f"} err="failed to get container status \"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f\": rpc error: code = NotFound desc = could not find container \"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f\": container with ID starting with 46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f not found: ID does 
not exist" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.611443 4795 scope.go:117] "RemoveContainer" containerID="9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.624260 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.629082 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.638896 4795 scope.go:117] "RemoveContainer" containerID="9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.639460 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b\": container with ID starting with 9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b not found: ID does not exist" containerID="9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.639491 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b"} err="failed to get container status \"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b\": rpc error: code = NotFound desc = could not find container \"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b\": container with ID starting with 9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b not found: ID does not exist" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.645843 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.649794 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.942839 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"] Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.943409 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.943815 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.944269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="324189bb-8d17-4759-8902-0e960316a64b" containerName="pruner" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.944450 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="324189bb-8d17-4759-8902-0e960316a64b" containerName="pruner" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.944603 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.944756 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.944880 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.945031 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.946509 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.946883 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.947094 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" containerName="route-controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.947591 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" containerName="route-controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.947928 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0486c12-c384-46ff-925b-bfeefb1d59bb" containerName="oc" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.948137 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0486c12-c384-46ff-925b-bfeefb1d59bb" containerName="oc" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.948587 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" containerName="controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.949929 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" containerName="controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.950150 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.951105 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.951312 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.951517 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.951769 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.952137 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.952269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.952377 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.952488 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.952597 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.952978 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" containerName="controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953115 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" containerName="route-controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953278 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953417 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0486c12-c384-46ff-925b-bfeefb1d59bb" containerName="oc" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953527 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="324189bb-8d17-4759-8902-0e960316a64b" containerName="pruner" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953644 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953833 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.954748 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.954934 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.955893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.955934 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.956017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.957086 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.957991 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.958151 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.958517 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.958717 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.958967 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.959069 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.959169 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.959536 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.959628 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.960070 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.960185 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.964587 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028671 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028733 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtd2x\" (UniqueName: \"kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " 
pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028820 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028960 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh6fm\" (UniqueName: \"kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.130827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.130932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " 
pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.130972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtd2x\" (UniqueName: \"kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131170 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131254 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131329 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh6fm\" (UniqueName: \"kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.132803 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.132847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 
17:22:18.133089 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.133324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.137024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.137730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.139673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.162388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh6fm\" (UniqueName: \"kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.163844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtd2x\" (UniqueName: \"kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.324445 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.333306 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.408872 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"] Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.409199 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2m5ld" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="registry-server" containerID="cri-o://c35ab86ff89c55002aa81e7c712981fc2a1f900a50a13c618ac66b790346ea8a" gracePeriod=2 Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.592704 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"] Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.598624 4795 generic.go:334] "Generic (PLEG): container finished" podID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerID="c35ab86ff89c55002aa81e7c712981fc2a1f900a50a13c618ac66b790346ea8a" exitCode=0 Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.598677 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerDied","Data":"c35ab86ff89c55002aa81e7c712981fc2a1f900a50a13c618ac66b790346ea8a"} Mar 20 17:22:18 crc kubenswrapper[4795]: W0320 17:22:18.611844 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d WatchSource:0}: Error finding container 291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d: Status 404 returned error can't find the container with id 291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.769332 
4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"] Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.770463 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:22:18 crc kubenswrapper[4795]: W0320 17:22:18.776809 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7 WatchSource:0}: Error finding container acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7: Status 404 returned error can't find the container with id acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7 Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.938876 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities\") pod \"58cc2d60-9778-460a-bd81-89c8078a4d96\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.939328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dff8d\" (UniqueName: \"kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d\") pod \"58cc2d60-9778-460a-bd81-89c8078a4d96\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.939385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content\") pod \"58cc2d60-9778-460a-bd81-89c8078a4d96\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 
17:22:18.939761 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities" (OuterVolumeSpecName: "utilities") pod "58cc2d60-9778-460a-bd81-89c8078a4d96" (UID: "58cc2d60-9778-460a-bd81-89c8078a4d96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.948391 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d" (OuterVolumeSpecName: "kube-api-access-dff8d") pod "58cc2d60-9778-460a-bd81-89c8078a4d96" (UID: "58cc2d60-9778-460a-bd81-89c8078a4d96"). InnerVolumeSpecName "kube-api-access-dff8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.041134 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dff8d\" (UniqueName: \"kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.041349 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.076351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58cc2d60-9778-460a-bd81-89c8078a4d96" (UID: "58cc2d60-9778-460a-bd81-89c8078a4d96"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.142043 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.261386 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" path="/var/lib/kubelet/pods/85b922ea-7281-44f5-b78b-b0ec5d5387d3/volumes" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.262099 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" path="/var/lib/kubelet/pods/d26689f8-7057-45ba-8d53-ae4623ecd2e9/volumes" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.604897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" event={"ID":"7bfdb898-c35d-488c-9478-4aa41570ca9e","Type":"ContainerStarted","Data":"dd0b9bec1f42791b4352a178de34f831f0e9f607b04c323609ff0b973c5cbcfc"} Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.604941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" event={"ID":"7bfdb898-c35d-488c-9478-4aa41570ca9e","Type":"ContainerStarted","Data":"acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7"} Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.605212 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.607400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" 
event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerDied","Data":"d58b7f5b37a6a35ef39d3d8b6ebcad0e2da7e5425eb92dcc45ec68fe40722e18"} Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.607437 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.607457 4795 scope.go:117] "RemoveContainer" containerID="c35ab86ff89c55002aa81e7c712981fc2a1f900a50a13c618ac66b790346ea8a" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.609054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" event={"ID":"5e3c5fd8-2990-4fb9-a8e6-224463172129","Type":"ContainerStarted","Data":"90a7d50899f36dcfd6366fa0b767071e3a607201ea44d9b052839270000be726"} Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.609090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" event={"ID":"5e3c5fd8-2990-4fb9-a8e6-224463172129","Type":"ContainerStarted","Data":"291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d"} Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.609323 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.613237 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.617024 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.628297 4795 scope.go:117] "RemoveContainer" 
containerID="ccb42fb76e422903cb8ae67b0af284fad8c19cf7bcd45ce5c5bd37e094afd21b" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.631232 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" podStartSLOduration=3.631217481 podStartE2EDuration="3.631217481s" podCreationTimestamp="2026-03-20 17:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:19.628366332 +0000 UTC m=+283.086397873" watchObservedRunningTime="2026-03-20 17:22:19.631217481 +0000 UTC m=+283.089249032" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.644615 4795 scope.go:117] "RemoveContainer" containerID="898e9ddb331a961041951b7bb1edfb2abf5db69d1009da036bfe796e8579e1e3" Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.647591 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"] Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.655149 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"] Mar 20 17:22:21 crc kubenswrapper[4795]: I0320 17:22:21.173217 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:22:21 crc kubenswrapper[4795]: I0320 17:22:21.204814 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" podStartSLOduration=5.204757741 podStartE2EDuration="5.204757741s" podCreationTimestamp="2026-03-20 17:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:19.684861177 +0000 UTC m=+283.142892738" watchObservedRunningTime="2026-03-20 17:22:21.204757741 +0000 UTC m=+284.662789322" Mar 20 
17:22:21 crc kubenswrapper[4795]: I0320 17:22:21.264678 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" path="/var/lib/kubelet/pods/58cc2d60-9778-460a-bd81-89c8078a4d96/volumes" Mar 20 17:22:23 crc kubenswrapper[4795]: I0320 17:22:23.024623 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mmtf7"] Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.478000 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"] Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.478890 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" podUID="5e3c5fd8-2990-4fb9-a8e6-224463172129" containerName="controller-manager" containerID="cri-o://90a7d50899f36dcfd6366fa0b767071e3a607201ea44d9b052839270000be726" gracePeriod=30 Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.573047 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"] Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.573791 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" podUID="7bfdb898-c35d-488c-9478-4aa41570ca9e" containerName="route-controller-manager" containerID="cri-o://dd0b9bec1f42791b4352a178de34f831f0e9f607b04c323609ff0b973c5cbcfc" gracePeriod=30 Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.738636 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e3c5fd8-2990-4fb9-a8e6-224463172129" containerID="90a7d50899f36dcfd6366fa0b767071e3a607201ea44d9b052839270000be726" exitCode=0 Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.738732 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7844559998-cfclj" event={"ID":"5e3c5fd8-2990-4fb9-a8e6-224463172129","Type":"ContainerDied","Data":"90a7d50899f36dcfd6366fa0b767071e3a607201ea44d9b052839270000be726"} Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.741943 4795 generic.go:334] "Generic (PLEG): container finished" podID="7bfdb898-c35d-488c-9478-4aa41570ca9e" containerID="dd0b9bec1f42791b4352a178de34f831f0e9f607b04c323609ff0b973c5cbcfc" exitCode=0 Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.741968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" event={"ID":"7bfdb898-c35d-488c-9478-4aa41570ca9e","Type":"ContainerDied","Data":"dd0b9bec1f42791b4352a178de34f831f0e9f607b04c323609ff0b973c5cbcfc"} Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.139845 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.154829 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert\") pod \"5e3c5fd8-2990-4fb9-a8e6-224463172129\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213113 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config\") pod \"7bfdb898-c35d-488c-9478-4aa41570ca9e\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213179 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtd2x\" (UniqueName: \"kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x\") pod \"5e3c5fd8-2990-4fb9-a8e6-224463172129\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213213 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config\") pod \"5e3c5fd8-2990-4fb9-a8e6-224463172129\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213254 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert\") pod \"7bfdb898-c35d-488c-9478-4aa41570ca9e\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213293 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fh6fm\" (UniqueName: \"kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm\") pod \"7bfdb898-c35d-488c-9478-4aa41570ca9e\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca\") pod \"7bfdb898-c35d-488c-9478-4aa41570ca9e\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca\") pod \"5e3c5fd8-2990-4fb9-a8e6-224463172129\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213418 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles\") pod \"5e3c5fd8-2990-4fb9-a8e6-224463172129\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.214080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config" (OuterVolumeSpecName: "config") pod "5e3c5fd8-2990-4fb9-a8e6-224463172129" (UID: "5e3c5fd8-2990-4fb9-a8e6-224463172129"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.214080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config" (OuterVolumeSpecName: "config") pod "7bfdb898-c35d-488c-9478-4aa41570ca9e" (UID: "7bfdb898-c35d-488c-9478-4aa41570ca9e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.214459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca" (OuterVolumeSpecName: "client-ca") pod "7bfdb898-c35d-488c-9478-4aa41570ca9e" (UID: "7bfdb898-c35d-488c-9478-4aa41570ca9e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.214834 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5e3c5fd8-2990-4fb9-a8e6-224463172129" (UID: "5e3c5fd8-2990-4fb9-a8e6-224463172129"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca" (OuterVolumeSpecName: "client-ca") pod "5e3c5fd8-2990-4fb9-a8e6-224463172129" (UID: "5e3c5fd8-2990-4fb9-a8e6-224463172129"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215297 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215334 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215359 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215384 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215410 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.218581 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm" (OuterVolumeSpecName: "kube-api-access-fh6fm") pod "7bfdb898-c35d-488c-9478-4aa41570ca9e" (UID: "7bfdb898-c35d-488c-9478-4aa41570ca9e"). InnerVolumeSpecName "kube-api-access-fh6fm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.218633 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x" (OuterVolumeSpecName: "kube-api-access-vtd2x") pod "5e3c5fd8-2990-4fb9-a8e6-224463172129" (UID: "5e3c5fd8-2990-4fb9-a8e6-224463172129"). InnerVolumeSpecName "kube-api-access-vtd2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.219265 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5e3c5fd8-2990-4fb9-a8e6-224463172129" (UID: "5e3c5fd8-2990-4fb9-a8e6-224463172129"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.225719 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7bfdb898-c35d-488c-9478-4aa41570ca9e" (UID: "7bfdb898-c35d-488c-9478-4aa41570ca9e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.316166 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.316419 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtd2x\" (UniqueName: \"kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.316431 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.316440 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh6fm\" (UniqueName: \"kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.750535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" event={"ID":"7bfdb898-c35d-488c-9478-4aa41570ca9e","Type":"ContainerDied","Data":"acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7"} Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.750595 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.750604 4795 scope.go:117] "RemoveContainer" containerID="dd0b9bec1f42791b4352a178de34f831f0e9f607b04c323609ff0b973c5cbcfc" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.753592 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" event={"ID":"5e3c5fd8-2990-4fb9-a8e6-224463172129","Type":"ContainerDied","Data":"291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d"} Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.753723 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.777858 4795 scope.go:117] "RemoveContainer" containerID="90a7d50899f36dcfd6366fa0b767071e3a607201ea44d9b052839270000be726" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.803985 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"] Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.811365 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"] Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.818241 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"] Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.821280 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"] Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.983645 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz"] Mar 20 
17:22:37 crc kubenswrapper[4795]: E0320 17:22:37.984119 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="extract-content" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.984206 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="extract-content" Mar 20 17:22:37 crc kubenswrapper[4795]: E0320 17:22:37.984312 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="extract-utilities" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.984403 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="extract-utilities" Mar 20 17:22:37 crc kubenswrapper[4795]: E0320 17:22:37.984484 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bfdb898-c35d-488c-9478-4aa41570ca9e" containerName="route-controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.984553 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bfdb898-c35d-488c-9478-4aa41570ca9e" containerName="route-controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: E0320 17:22:37.984625 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e3c5fd8-2990-4fb9-a8e6-224463172129" containerName="controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.984712 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3c5fd8-2990-4fb9-a8e6-224463172129" containerName="controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: E0320 17:22:37.984814 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="registry-server" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.984908 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" 
containerName="registry-server" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.985110 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e3c5fd8-2990-4fb9-a8e6-224463172129" containerName="controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.985201 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="registry-server" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.985278 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bfdb898-c35d-488c-9478-4aa41570ca9e" containerName="route-controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.985811 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.986326 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz"] Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.990146 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.990663 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.990869 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.991322 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.991510 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.991669 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.992456 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.996093 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.996225 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.996409 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.996608 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.996766 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.997323 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 
20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.001737 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz"] Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.004840 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz"] Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.031452 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.131498 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-config\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.131835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783215b2-064c-42d0-a523-6f4f9259526a-serving-cert\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.131986 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvh9x\" (UniqueName: \"kubernetes.io/projected/783215b2-064c-42d0-a523-6f4f9259526a-kube-api-access-zvh9x\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132109 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-client-ca\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132234 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7tmg\" (UniqueName: \"kubernetes.io/projected/e9495f82-c066-4979-9707-1d0b732dc77c-kube-api-access-m7tmg\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132350 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-config\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-client-ca\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132622 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9495f82-c066-4979-9707-1d0b732dc77c-serving-cert\") pod \"route-controller-manager-595896cb64-tcxtz\" 
(UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132857 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-proxy-ca-bundles\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.233625 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-proxy-ca-bundles\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.233883 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-config\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234040 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783215b2-064c-42d0-a523-6f4f9259526a-serving-cert\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zvh9x\" (UniqueName: \"kubernetes.io/projected/783215b2-064c-42d0-a523-6f4f9259526a-kube-api-access-zvh9x\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-client-ca\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7tmg\" (UniqueName: \"kubernetes.io/projected/e9495f82-c066-4979-9707-1d0b732dc77c-kube-api-access-m7tmg\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-config\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-client-ca\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 
17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234555 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9495f82-c066-4979-9707-1d0b732dc77c-serving-cert\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.235196 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-config\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.235216 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-client-ca\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.235241 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-proxy-ca-bundles\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.235379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-client-ca\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " 
pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.235878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-config\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.240462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783215b2-064c-42d0-a523-6f4f9259526a-serving-cert\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.254399 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9495f82-c066-4979-9707-1d0b732dc77c-serving-cert\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.258904 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvh9x\" (UniqueName: \"kubernetes.io/projected/783215b2-064c-42d0-a523-6f4f9259526a-kube-api-access-zvh9x\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.261493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tmg\" (UniqueName: \"kubernetes.io/projected/e9495f82-c066-4979-9707-1d0b732dc77c-kube-api-access-m7tmg\") 
pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.337807 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.345347 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.734119 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz"] Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.765102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" event={"ID":"783215b2-064c-42d0-a523-6f4f9259526a","Type":"ContainerStarted","Data":"75b2d9dc239e36a4a908c3bd897a6107015f08e5c989407cb2efb25f3c76567b"} Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.796965 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz"] Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.259731 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e3c5fd8-2990-4fb9-a8e6-224463172129" path="/var/lib/kubelet/pods/5e3c5fd8-2990-4fb9-a8e6-224463172129/volumes" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.260920 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bfdb898-c35d-488c-9478-4aa41570ca9e" path="/var/lib/kubelet/pods/7bfdb898-c35d-488c-9478-4aa41570ca9e/volumes" Mar 20 17:22:39 crc kubenswrapper[4795]: E0320 17:22:39.638983 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache]" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.788283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" event={"ID":"783215b2-064c-42d0-a523-6f4f9259526a","Type":"ContainerStarted","Data":"a52553f6e36343f68d4503456fa55802236a40cab19262e24a90ce7ac65e16c2"} Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.788669 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.790539 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" event={"ID":"e9495f82-c066-4979-9707-1d0b732dc77c","Type":"ContainerStarted","Data":"dc7aba5d14ae47e99c998b4f8ebff7603f1c540290bc0c22044158609de9cdd8"} Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.790791 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.790926 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" event={"ID":"e9495f82-c066-4979-9707-1d0b732dc77c","Type":"ContainerStarted","Data":"267d1426526b69c78e2efd2918a303fa490695b0de8d306b9636116b0b7735ec"} Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.793054 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.795780 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.803523 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" podStartSLOduration=3.803510539 podStartE2EDuration="3.803510539s" podCreationTimestamp="2026-03-20 17:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:39.80225323 +0000 UTC m=+303.260284801" watchObservedRunningTime="2026-03-20 17:22:39.803510539 +0000 UTC m=+303.261542080" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.820635 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" podStartSLOduration=3.820619711 podStartE2EDuration="3.820619711s" podCreationTimestamp="2026-03-20 17:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:39.819240758 +0000 UTC m=+303.277272339" watchObservedRunningTime="2026-03-20 17:22:39.820619711 +0000 UTC m=+303.278651252" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.068100 4795 kubelet.go:2421] 
"SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.069539 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.070417 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.071062 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080" gracePeriod=15 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.071131 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b" gracePeriod=15 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.071144 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94" gracePeriod=15 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.071178 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96" gracePeriod=15 Mar 20 
17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.071157 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018" gracePeriod=15 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.072667 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.072932 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.072952 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.072980 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.072993 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073011 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073023 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073038 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc 
kubenswrapper[4795]: I0320 17:22:42.073050 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073064 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073077 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073095 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073107 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073123 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073134 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073149 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073160 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073176 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073187 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073201 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073214 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073421 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073444 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073462 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073498 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073519 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073534 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073552 
4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073912 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073932 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.096410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.098014 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.098660 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.100218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.100260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.100357 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.100402 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.100496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.136189 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.201834 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202174 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.201992 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc 
kubenswrapper[4795]: I0320 17:22:42.202386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202400 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202421 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202466 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202485 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.251847 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.252877 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.253198 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.253377 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.253543 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.253561 4795 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.253769 4795 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="200ms" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.423010 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: W0320 17:22:42.443201 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-c53e75fe9c1a487613bfd138dd166a448ecc9228923a4a4a51385d837d83193f WatchSource:0}: Error finding container c53e75fe9c1a487613bfd138dd166a448ecc9228923a4a4a51385d837d83193f: Status 404 returned error can't find the container with id c53e75fe9c1a487613bfd138dd166a448ecc9228923a4a4a51385d837d83193f Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.446183 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9c7ca864bab1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC 
m=+305.903637104,LastTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC m=+305.903637104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.454639 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="400ms" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.822168 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"02a472e27a9df1f2394993beadd98fb0933889c8dc6d19ca5759e2e2012a7874"} Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.822578 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c53e75fe9c1a487613bfd138dd166a448ecc9228923a4a4a51385d837d83193f"} Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.823214 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.823508 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: 
connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.825927 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.827788 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.828966 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b" exitCode=0 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.829157 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96" exitCode=0 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.829303 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94" exitCode=0 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.829431 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018" exitCode=2 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.829052 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.832181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"49054187-cb30-4f07-b67a-794c2503f50a","Type":"ContainerDied","Data":"30207c849aa57355d0d1027a4dded499a63ab8fad8a6d8162ced0752bd75a382"} Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.832195 4795 generic.go:334] "Generic (PLEG): container finished" podID="49054187-cb30-4f07-b67a-794c2503f50a" containerID="30207c849aa57355d0d1027a4dded499a63ab8fad8a6d8162ced0752bd75a382" exitCode=0 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.833225 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.834024 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.834575 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.856052 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="800ms" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.972444 4795 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.972538 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 20 17:22:43 crc kubenswrapper[4795]: E0320 17:22:43.066007 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9c7ca864bab1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC m=+305.903637104,LastTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC m=+305.903637104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:22:43 crc kubenswrapper[4795]: E0320 17:22:43.657636 4795 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="1.6s" Mar 20 17:22:43 crc kubenswrapper[4795]: I0320 17:22:43.844245 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.338532 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.339300 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.339785 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437562 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock\") pod \"49054187-cb30-4f07-b67a-794c2503f50a\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437629 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir\") pod \"49054187-cb30-4f07-b67a-794c2503f50a\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437651 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access\") pod \"49054187-cb30-4f07-b67a-794c2503f50a\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock" (OuterVolumeSpecName: "var-lock") pod "49054187-cb30-4f07-b67a-794c2503f50a" (UID: "49054187-cb30-4f07-b67a-794c2503f50a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437729 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "49054187-cb30-4f07-b67a-794c2503f50a" (UID: "49054187-cb30-4f07-b67a-794c2503f50a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437919 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437934 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.442053 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.443325 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.443885 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.444292 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.444644 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.445197 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "49054187-cb30-4f07-b67a-794c2503f50a" (UID: "49054187-cb30-4f07-b67a-794c2503f50a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.538746 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.538843 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.538885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.538951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.538963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.539064 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.539404 4795 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.539431 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.539451 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.539469 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.856210 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.857749 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080" exitCode=0 Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.857814 4795 scope.go:117] "RemoveContainer" containerID="554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.857917 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.861466 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"49054187-cb30-4f07-b67a-794c2503f50a","Type":"ContainerDied","Data":"bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa"} Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.861522 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.861535 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.874934 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.875572 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.876290 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 
crc kubenswrapper[4795]: I0320 17:22:44.887906 4795 scope.go:117] "RemoveContainer" containerID="730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.890592 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.890965 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.891314 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.913205 4795 scope.go:117] "RemoveContainer" containerID="876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.932849 4795 scope.go:117] "RemoveContainer" containerID="7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.956499 4795 scope.go:117] "RemoveContainer" containerID="f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.980521 4795 scope.go:117] "RemoveContainer" 
containerID="357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.005992 4795 scope.go:117] "RemoveContainer" containerID="554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.006350 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\": container with ID starting with 554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b not found: ID does not exist" containerID="554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.006382 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b"} err="failed to get container status \"554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\": rpc error: code = NotFound desc = could not find container \"554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\": container with ID starting with 554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.006402 4795 scope.go:117] "RemoveContainer" containerID="730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.006621 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\": container with ID starting with 730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96 not found: ID does not exist" containerID="730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96" Mar 20 17:22:45 crc 
kubenswrapper[4795]: I0320 17:22:45.006644 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96"} err="failed to get container status \"730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\": rpc error: code = NotFound desc = could not find container \"730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\": container with ID starting with 730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96 not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.006660 4795 scope.go:117] "RemoveContainer" containerID="876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.006917 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\": container with ID starting with 876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94 not found: ID does not exist" containerID="876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.006965 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94"} err="failed to get container status \"876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\": rpc error: code = NotFound desc = could not find container \"876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\": container with ID starting with 876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94 not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.006979 4795 scope.go:117] "RemoveContainer" containerID="7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018" Mar 20 
17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.008241 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\": container with ID starting with 7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018 not found: ID does not exist" containerID="7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.008263 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018"} err="failed to get container status \"7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\": rpc error: code = NotFound desc = could not find container \"7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\": container with ID starting with 7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018 not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.008278 4795 scope.go:117] "RemoveContainer" containerID="f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.008611 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\": container with ID starting with f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080 not found: ID does not exist" containerID="f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.008636 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080"} err="failed to get container status 
\"f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\": rpc error: code = NotFound desc = could not find container \"f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\": container with ID starting with f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080 not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.008649 4795 scope.go:117] "RemoveContainer" containerID="357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.008919 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\": container with ID starting with 357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782 not found: ID does not exist" containerID="357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.008948 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782"} err="failed to get container status \"357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\": rpc error: code = NotFound desc = could not find container \"357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\": container with ID starting with 357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782 not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.258183 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="3.2s" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.262289 4795 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 17:22:47 crc kubenswrapper[4795]: I0320 17:22:47.257608 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:47 crc kubenswrapper[4795]: I0320 17:22:47.258261 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.060250 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerName="oauth-openshift" containerID="cri-o://5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f" gracePeriod=15 Mar 20 17:22:48 crc kubenswrapper[4795]: E0320 17:22:48.459242 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="6.4s" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.601940 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.602665 4795 status_manager.go:851] "Failed to get status for pod" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-mmtf7\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.603202 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.604003 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708259 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708334 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") 
" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708512 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708578 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708495 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.709585 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708678 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.709937 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.709991 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710059 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710108 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: 
\"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdtvs\" (UniqueName: \"kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710192 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710249 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710291 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710311 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). 
InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.711392 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.711841 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.711926 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.711952 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.711973 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.713530 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca" 
(OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.718459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.719217 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.719424 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs" (OuterVolumeSpecName: "kube-api-access-fdtvs") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "kube-api-access-fdtvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.719944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.720218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.723097 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.724572 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.725093 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.727159 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812469 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812504 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812514 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 
17:22:48.812523 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812532 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812541 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812552 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812563 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdtvs\" (UniqueName: \"kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812571 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812580 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.890468 4795 generic.go:334] "Generic (PLEG): container finished" podID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerID="5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f" exitCode=0 Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.890540 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.890534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" event={"ID":"74d8b767-93df-4c96-a7f0-e7e84ba99380","Type":"ContainerDied","Data":"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f"} Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.891105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" event={"ID":"74d8b767-93df-4c96-a7f0-e7e84ba99380","Type":"ContainerDied","Data":"54c16e287e6b044067d81a5f122f5fce8bd8b850064a731beff318d152b5a0e9"} Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.891130 4795 scope.go:117] "RemoveContainer" containerID="5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.892545 4795 status_manager.go:851] "Failed to get status for pod" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-mmtf7\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.893012 4795 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.893330 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.913808 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.914190 4795 status_manager.go:851] "Failed to get status for pod" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-mmtf7\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.914548 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.925084 4795 scope.go:117] "RemoveContainer" 
containerID="5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f" Mar 20 17:22:48 crc kubenswrapper[4795]: E0320 17:22:48.925550 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f\": container with ID starting with 5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f not found: ID does not exist" containerID="5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.925591 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f"} err="failed to get container status \"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f\": rpc error: code = NotFound desc = could not find container \"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f\": container with ID starting with 5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f not found: ID does not exist" Mar 20 17:22:49 crc kubenswrapper[4795]: E0320 17:22:49.783302 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache]" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.251500 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.253809 4795 status_manager.go:851] "Failed to get status for pod" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-mmtf7\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.254531 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.255166 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.270924 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.270979 4795 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:52 crc kubenswrapper[4795]: E0320 17:22:52.271564 4795 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.272179 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:52 crc kubenswrapper[4795]: W0320 17:22:52.305905 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-f5c5cb4337bdece5572d0d0363d6a4a14622a6b48d7933b18c528a4a8c2f2a6d WatchSource:0}: Error finding container f5c5cb4337bdece5572d0d0363d6a4a14622a6b48d7933b18c528a4a8c2f2a6d: Status 404 returned error can't find the container with id f5c5cb4337bdece5572d0d0363d6a4a14622a6b48d7933b18c528a4a8c2f2a6d Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.936753 4795 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="12ddd5d234fe00d0e7d99a02a214fba89126a9e4151814b42f99988eaca28de9" exitCode=0 Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.936856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"12ddd5d234fe00d0e7d99a02a214fba89126a9e4151814b42f99988eaca28de9"} Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.937182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f5c5cb4337bdece5572d0d0363d6a4a14622a6b48d7933b18c528a4a8c2f2a6d"} Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.937569 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.937601 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:52 crc kubenswrapper[4795]: E0320 17:22:52.938152 4795 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.938175 4795 status_manager.go:851] "Failed to get status for pod" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-mmtf7\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.938615 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.939061 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:53 crc kubenswrapper[4795]: E0320 17:22:53.067136 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9c7ca864bab1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC m=+305.903637104,LastTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC m=+305.903637104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:22:53 crc kubenswrapper[4795]: I0320 17:22:53.945387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d9a2915b230565006626461f0115c0347a53283c27adb636bedd05c3e333e802"} Mar 20 17:22:53 crc kubenswrapper[4795]: I0320 17:22:53.945736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f9affc3d86a0fa9b02e5575739abc74afe9da1997cd31dd1261239364f081bf5"} Mar 20 17:22:53 crc kubenswrapper[4795]: I0320 17:22:53.945746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"72f892532c7c5996be453821f507a0511c3cff77c37b060440661f4b51418fa2"} Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.955846 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.956751 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.956830 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950" exitCode=1 Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.956939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950"} Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.957604 4795 scope.go:117] "RemoveContainer" containerID="dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950" Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.964257 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d379816cc9a978f0a780f650e77d1ba0087c3e10f99d70a11a2f9f9c3aeea2c3"} Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.964306 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bbf2d4f28a6b1053fd498390c982e3dbef480407ed5111fadfd90a0f54641c1b"} Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.964555 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.964584 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.964608 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:55 crc kubenswrapper[4795]: I0320 17:22:55.745076 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:22:55 crc kubenswrapper[4795]: I0320 17:22:55.975059 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:22:55 crc kubenswrapper[4795]: I0320 17:22:55.976207 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 17:22:55 crc kubenswrapper[4795]: I0320 17:22:55.976294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d4d13b4f6f960995829a9e21a67c7d3e0b19cec5cac56f814725866cead5f520"} Mar 20 17:22:57 crc kubenswrapper[4795]: I0320 17:22:57.272456 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:57 crc kubenswrapper[4795]: I0320 17:22:57.272521 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:57 crc kubenswrapper[4795]: I0320 17:22:57.281327 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:58 crc kubenswrapper[4795]: I0320 17:22:58.385532 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:22:58 crc kubenswrapper[4795]: I0320 17:22:58.386000 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 17:22:58 crc kubenswrapper[4795]: I0320 17:22:58.386487 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 17:22:59 crc kubenswrapper[4795]: E0320 17:22:59.923637 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache]" Mar 20 17:22:59 crc kubenswrapper[4795]: I0320 17:22:59.974074 4795 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:23:00 crc kubenswrapper[4795]: I0320 17:23:00.009600 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:23:00 crc kubenswrapper[4795]: I0320 17:23:00.009658 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:23:00 crc kubenswrapper[4795]: I0320 17:23:00.018788 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:23:00 crc kubenswrapper[4795]: I0320 17:23:00.117189 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="3f481f95-4236-49c0-a819-66b5416cb925" Mar 20 17:23:01 crc kubenswrapper[4795]: I0320 17:23:01.015768 4795 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:23:01 crc kubenswrapper[4795]: I0320 17:23:01.015826 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:23:01 crc kubenswrapper[4795]: I0320 17:23:01.018902 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="3f481f95-4236-49c0-a819-66b5416cb925" Mar 20 17:23:02 crc kubenswrapper[4795]: I0320 17:23:02.956485 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.226517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.226973 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.227130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.230168 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.230203 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.230940 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.239132 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.239318 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.253356 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.253944 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") 
pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.330833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.337029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.375980 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.391316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.404170 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:23:04 crc kubenswrapper[4795]: W0320 17:23:04.932058 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-f14ed71a174f01c8546b89745e1a1f61c02f8af9d66f6e9ada2ca45f28cda98f WatchSource:0}: Error finding container f14ed71a174f01c8546b89745e1a1f61c02f8af9d66f6e9ada2ca45f28cda98f: Status 404 returned error can't find the container with id f14ed71a174f01c8546b89745e1a1f61c02f8af9d66f6e9ada2ca45f28cda98f Mar 20 17:23:05 crc kubenswrapper[4795]: W0320 17:23:05.007451 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-67f03fa1ae0e48ed154c1dbdb6137d350a3b5a11d1953bc27ca239ec9c3611d9 WatchSource:0}: Error finding container 67f03fa1ae0e48ed154c1dbdb6137d350a3b5a11d1953bc27ca239ec9c3611d9: Status 404 returned error can't find the container with id 67f03fa1ae0e48ed154c1dbdb6137d350a3b5a11d1953bc27ca239ec9c3611d9 Mar 20 17:23:05 crc kubenswrapper[4795]: I0320 17:23:05.050551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"67f03fa1ae0e48ed154c1dbdb6137d350a3b5a11d1953bc27ca239ec9c3611d9"} Mar 20 17:23:05 crc kubenswrapper[4795]: I0320 17:23:05.052450 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f14ed71a174f01c8546b89745e1a1f61c02f8af9d66f6e9ada2ca45f28cda98f"} Mar 20 17:23:05 crc kubenswrapper[4795]: W0320 17:23:05.071208 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d3e37295410ee8bbc13aa5e3326b77ccf092bd502d694e43da521303630a866f WatchSource:0}: Error finding container d3e37295410ee8bbc13aa5e3326b77ccf092bd502d694e43da521303630a866f: Status 404 returned error can't find the container with id d3e37295410ee8bbc13aa5e3326b77ccf092bd502d694e43da521303630a866f Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.047894 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.064124 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6771992b61542ed1ebd988c425bef3156fac5bdb7fe3d5365ed4ee0e5a60b3c7"} Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.064194 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d3e37295410ee8bbc13aa5e3326b77ccf092bd502d694e43da521303630a866f"} Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.064404 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.066970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"231e8936baa065c0b973a2f2518b4056e1234d1e287dc1cd623692b19a4a16ad"} Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.070736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ba02a145c138c54849e78cb97f82c698845800b8de0805be5c6d93c948a71de9"} Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.224269 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.666878 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.667129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.733285 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.975904 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.079803 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.079885 4795 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="231e8936baa065c0b973a2f2518b4056e1234d1e287dc1cd623692b19a4a16ad" exitCode=255 Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.079990 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"231e8936baa065c0b973a2f2518b4056e1234d1e287dc1cd623692b19a4a16ad"} Mar 
20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.080590 4795 scope.go:117] "RemoveContainer" containerID="231e8936baa065c0b973a2f2518b4056e1234d1e287dc1cd623692b19a4a16ad" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.597378 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.647291 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.687749 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.924715 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.942033 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.054065 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.087123 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.087182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707"} Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.188259 4795 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.217388 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.254423 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.281349 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.284643 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.299708 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.378780 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.386862 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.386960 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: 
connection refused" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.544195 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.641562 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.778452 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.007834 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.089895 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.095283 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.095998 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.096074 4795 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707" exitCode=255 Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.096133 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707"} Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.096203 4795 scope.go:117] "RemoveContainer" containerID="231e8936baa065c0b973a2f2518b4056e1234d1e287dc1cd623692b19a4a16ad" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.096744 4795 scope.go:117] "RemoveContainer" containerID="b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707" Mar 20 17:23:09 crc kubenswrapper[4795]: E0320 17:23:09.097165 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.206494 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.240156 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.441805 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.482670 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.634974 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 
17:23:09.706975 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.863540 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 17:23:10 crc kubenswrapper[4795]: E0320 17:23:10.073777 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache]" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.074576 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.105266 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.106247 4795 scope.go:117] "RemoveContainer" containerID="b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707" Mar 20 17:23:10 crc kubenswrapper[4795]: E0320 
17:23:10.106590 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.157885 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.161943 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=28.161917482 podStartE2EDuration="28.161917482s" podCreationTimestamp="2026-03-20 17:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:23:00.037671249 +0000 UTC m=+323.495702790" watchObservedRunningTime="2026-03-20 17:23:10.161917482 +0000 UTC m=+333.619949073" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.166042 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mmtf7","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.166196 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.173261 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.173905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 17:23:10 crc 
kubenswrapper[4795]: I0320 17:23:10.194850 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=11.19482764 podStartE2EDuration="11.19482764s" podCreationTimestamp="2026-03-20 17:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:23:10.189490619 +0000 UTC m=+333.647522230" watchObservedRunningTime="2026-03-20 17:23:10.19482764 +0000 UTC m=+333.652859211" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.269765 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.516586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.526513 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.638568 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.656120 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.662514 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.006212 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.006561 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://02a472e27a9df1f2394993beadd98fb0933889c8dc6d19ca5759e2e2012a7874" gracePeriod=5 Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.265786 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" path="/var/lib/kubelet/pods/74d8b767-93df-4c96-a7f0-e7e84ba99380/volumes" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.294213 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.642562 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.661865 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.681523 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.765248 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.841277 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 17:23:12 crc kubenswrapper[4795]: I0320 17:23:12.322644 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 17:23:12 crc kubenswrapper[4795]: I0320 17:23:12.645905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 17:23:12 crc 
kubenswrapper[4795]: I0320 17:23:12.773911 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 17:23:13 crc kubenswrapper[4795]: I0320 17:23:13.532045 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 17:23:13 crc kubenswrapper[4795]: I0320 17:23:13.656751 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.100077 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.170537 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.289224 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.392624 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.437618 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.669869 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.905399 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.128631 4795 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.209482 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.282186 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.308881 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.361576 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.455597 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.470555 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.508032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.599839 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.656422 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.712151 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.782912 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.076640 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.129757 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.150835 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.150931 4795 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="02a472e27a9df1f2394993beadd98fb0933889c8dc6d19ca5759e2e2012a7874" exitCode=137 Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.209362 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.257156 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.597223 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.628168 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.628675 4795 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.698894 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801599 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801676 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801794 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:23:16 crc 
kubenswrapper[4795]: I0320 17:23:16.801941 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802098 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802401 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802432 4795 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802452 4795 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802472 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.817452 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.903292 4795 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.965989 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.161344 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.161475 4795 scope.go:117] "RemoveContainer" containerID="02a472e27a9df1f2394993beadd98fb0933889c8dc6d19ca5759e2e2012a7874" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.161773 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.188479 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.258634 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.264523 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.265019 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.285302 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.285353 4795 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a3302709-36e6-471a-b69d-33af908a64cd" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.293544 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.294226 4795 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a3302709-36e6-471a-b69d-33af908a64cd" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.294492 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.315675 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.333429 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.402313 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.414382 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.437254 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.470929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.699810 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.735314 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.747723 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.799440 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.849174 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.986353 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.017466 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.057288 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.068939 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.077766 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.084974 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.094294 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.387124 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.387218 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.387294 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.388272 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"d4d13b4f6f960995829a9e21a67c7d3e0b19cec5cac56f814725866cead5f520"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.388470 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://d4d13b4f6f960995829a9e21a67c7d3e0b19cec5cac56f814725866cead5f520" gracePeriod=30 Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.413366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.421562 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.494141 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.515587 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 
20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.583613 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.642335 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.667345 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.735363 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.757929 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.800362 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.861822 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.902530 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.953524 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.012851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.027468 4795 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.139756 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.547818 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.649173 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.649543 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.868432 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.919434 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.981973 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.102752 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.114920 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 17:23:20 crc kubenswrapper[4795]: E0320 17:23:20.244690 4795 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache]" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.288139 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.365256 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.458523 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.525357 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.531492 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.540772 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.666562 4795 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.690411 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.714116 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.715297 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.821918 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.015380 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.082434 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.177412 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.195660 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.259007 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.297296 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 
17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.297645 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.586072 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.751562 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.769946 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.779585 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.827174 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.047218 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.124018 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.214614 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.254049 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.303056 4795 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.310205 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.415876 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.656851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.657742 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.669294 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.680329 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.712809 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.793491 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.807804 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.965314 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.061738 4795 reflector.go:368] Caches populated for 
*v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.066571 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.097561 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.239274 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.252469 4795 scope.go:117] "RemoveContainer" containerID="b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.332501 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.338773 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.344114 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.365722 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.449154 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.474252 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.504275 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.564349 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.579854 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.581119 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.728265 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.760045 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.827648 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.922597 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.023370 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.090954 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.217587 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 
20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.217672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201"} Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.280934 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.350028 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.440346 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.540724 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.585002 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.619578 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.672263 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.724513 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.760711 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" 
Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.857630 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.913573 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.951829 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.959871 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.081863 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.228005 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.228720 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.228799 4795 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201" exitCode=255 Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.228861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201"} Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.228926 4795 scope.go:117] "RemoveContainer" containerID="b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.230458 4795 scope.go:117] "RemoveContainer" containerID="25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201" Mar 20 17:23:25 crc kubenswrapper[4795]: E0320 17:23:25.230834 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.270195 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.320739 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.342448 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.437161 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.466670 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:23:25 crc 
kubenswrapper[4795]: I0320 17:23:25.595359 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.633424 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.671932 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.722494 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.743421 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.836990 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.908282 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.931313 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.006040 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.235826 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.238277 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.253295 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.397518 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.525625 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.725359 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.763200 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.129612 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.290726 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"] Mar 20 17:23:27 crc kubenswrapper[4795]: E0320 17:23:27.291114 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291143 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:23:27 crc kubenswrapper[4795]: E0320 17:23:27.291181 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49054187-cb30-4f07-b67a-794c2503f50a" 
containerName="installer" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291197 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="49054187-cb30-4f07-b67a-794c2503f50a" containerName="installer" Mar 20 17:23:27 crc kubenswrapper[4795]: E0320 17:23:27.291230 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerName="oauth-openshift" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291247 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerName="oauth-openshift" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291458 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291493 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerName="oauth-openshift" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291521 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="49054187-cb30-4f07-b67a-794c2503f50a" containerName="installer" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.292296 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.296228 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.296264 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.296517 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.297126 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.297196 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.297503 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.298037 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.298762 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.300848 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.301075 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 17:23:27 crc 
kubenswrapper[4795]: I0320 17:23:27.303564 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.303817 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.309375 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.317435 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"] Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.320243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.322273 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.325548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.329157 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-router-certs\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc 
kubenswrapper[4795]: I0320 17:23:27.364594 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364655 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-login\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-error\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364776 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364819 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-audit-policies\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365111 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365175 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6dc2\" (UniqueName: \"kubernetes.io/projected/85ae2267-12da-42ef-8382-75d6aa39b954-kube-api-access-t6dc2\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365324 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85ae2267-12da-42ef-8382-75d6aa39b954-audit-dir\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365593 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-service-ca\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365663 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-session\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc 
kubenswrapper[4795]: I0320 17:23:27.377851 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466286 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466384 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-audit-policies\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466461 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dc2\" (UniqueName: \"kubernetes.io/projected/85ae2267-12da-42ef-8382-75d6aa39b954-kube-api-access-t6dc2\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466525 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466556 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85ae2267-12da-42ef-8382-75d6aa39b954-audit-dir\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466629 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-service-ca\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-session\") pod 
\"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-router-certs\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-login\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-error\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466808 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.467645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85ae2267-12da-42ef-8382-75d6aa39b954-audit-dir\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.468056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.468430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-audit-policies\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.469021 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " 
pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.469560 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-service-ca\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.474673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.474761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-error\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.476141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.476855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-login\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.476885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-session\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.477027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-router-certs\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.477376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.478217 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " 
pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.495969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dc2\" (UniqueName: \"kubernetes.io/projected/85ae2267-12da-42ef-8382-75d6aa39b954-kube-api-access-t6dc2\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.549544 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.625919 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.930152 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.106341 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.115512 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"] Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.254376 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" event={"ID":"85ae2267-12da-42ef-8382-75d6aa39b954","Type":"ContainerStarted","Data":"c56de0fe70fac295a9491c34522e6f93fe547c373e6e173289b0df1ab2d61004"} Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.324735 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 17:23:28 crc 
kubenswrapper[4795]: I0320 17:23:28.434075 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.590360 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.006682 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.059455 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.263216 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.263264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" event={"ID":"85ae2267-12da-42ef-8382-75d6aa39b954","Type":"ContainerStarted","Data":"b07f42b1bf447b1e61806e312444c517c0c6f0dcde6496eb41887f0ceaaae57e"} Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.268183 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.323803 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" podStartSLOduration=66.323781343 podStartE2EDuration="1m6.323781343s" podCreationTimestamp="2026-03-20 17:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:23:29.292059063 +0000 UTC m=+352.750090604" watchObservedRunningTime="2026-03-20 17:23:29.323781343 +0000 UTC m=+352.781812884" Mar 20 
17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.342200 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.465149 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.683139 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.745410 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.849994 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.896517 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 17:23:30 crc kubenswrapper[4795]: E0320 17:23:30.399809 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache]" Mar 20 17:23:30 crc kubenswrapper[4795]: I0320 17:23:30.998426 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 17:23:37 crc kubenswrapper[4795]: E0320 17:23:37.286192 4795 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/a28636b1ecd20bb4083afbd5ef8fb21bddc7f459d75a0207e71de4fce8d42ee9/diff" to get inode usage: stat /var/lib/containers/storage/overlay/a28636b1ecd20bb4083afbd5ef8fb21bddc7f459d75a0207e71de4fce8d42ee9/diff: no such file or directory, extraDiskErr: Mar 20 17:23:37 crc kubenswrapper[4795]: E0320 17:23:37.286355 4795 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/2873f9ef1dc8861ab4921f68e7ee5d71811d6d581545e885e23b8f85cb0879d5/diff" to get inode usage: stat /var/lib/containers/storage/overlay/2873f9ef1dc8861ab4921f68e7ee5d71811d6d581545e885e23b8f85cb0879d5/diff: no such file or directory, extraDiskErr: Mar 20 17:23:39 crc kubenswrapper[4795]: I0320 17:23:39.253018 4795 scope.go:117] "RemoveContainer" containerID="25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201" Mar 20 17:23:39 crc kubenswrapper[4795]: E0320 17:23:39.253645 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:23:44 crc kubenswrapper[4795]: I0320 17:23:44.410733 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.396561 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.399862 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.400794 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.400880 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d4d13b4f6f960995829a9e21a67c7d3e0b19cec5cac56f814725866cead5f520" exitCode=137 Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.400922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d4d13b4f6f960995829a9e21a67c7d3e0b19cec5cac56f814725866cead5f520"} Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.400962 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f959cd95b7529d39a7d10c18164d045af09efa73e7189e024e528bd75f24eb45"} Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.400990 4795 scope.go:117] "RemoveContainer" containerID="dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950" Mar 20 17:23:50 crc kubenswrapper[4795]: I0320 17:23:50.409012 
4795 generic.go:334] "Generic (PLEG): container finished" podID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerID="c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe" exitCode=0 Mar 20 17:23:50 crc kubenswrapper[4795]: I0320 17:23:50.409073 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerDied","Data":"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe"} Mar 20 17:23:50 crc kubenswrapper[4795]: I0320 17:23:50.410372 4795 scope.go:117] "RemoveContainer" containerID="c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe" Mar 20 17:23:50 crc kubenswrapper[4795]: I0320 17:23:50.416865 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 17:23:50 crc kubenswrapper[4795]: I0320 17:23:50.420783 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:23:51 crc kubenswrapper[4795]: I0320 17:23:51.429417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerStarted","Data":"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f"} Mar 20 17:23:51 crc kubenswrapper[4795]: I0320 17:23:51.430739 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:23:51 crc kubenswrapper[4795]: I0320 17:23:51.434515 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:23:52 crc kubenswrapper[4795]: I0320 
17:23:52.252007 4795 scope.go:117] "RemoveContainer" containerID="25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201" Mar 20 17:23:52 crc kubenswrapper[4795]: I0320 17:23:52.451536 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 20 17:23:52 crc kubenswrapper[4795]: I0320 17:23:52.452254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b2d95a75f92321d727c05b3f6ba6a7835679bffdbaf8404b31b93e6e5deb82aa"} Mar 20 17:23:52 crc kubenswrapper[4795]: I0320 17:23:52.955578 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:23:58 crc kubenswrapper[4795]: I0320 17:23:58.386600 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:23:58 crc kubenswrapper[4795]: I0320 17:23:58.394782 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:24:02 crc kubenswrapper[4795]: I0320 17:24:02.966771 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.047978 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567124-wjlwc"] Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.049122 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.050592 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.050905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.051078 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.068607 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-wjlwc"] Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.182203 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x59l4\" (UniqueName: \"kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4\") pod \"auto-csr-approver-29567124-wjlwc\" (UID: \"35c14395-0a4c-47be-8f64-382e60e3faad\") " pod="openshift-infra/auto-csr-approver-29567124-wjlwc" Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.283734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x59l4\" (UniqueName: \"kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4\") pod \"auto-csr-approver-29567124-wjlwc\" (UID: \"35c14395-0a4c-47be-8f64-382e60e3faad\") " pod="openshift-infra/auto-csr-approver-29567124-wjlwc" Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.303878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x59l4\" (UniqueName: \"kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4\") pod \"auto-csr-approver-29567124-wjlwc\" (UID: \"35c14395-0a4c-47be-8f64-382e60e3faad\") " 
pod="openshift-infra/auto-csr-approver-29567124-wjlwc" Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.363379 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.829246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-wjlwc"] Mar 20 17:24:10 crc kubenswrapper[4795]: I0320 17:24:10.592861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" event={"ID":"35c14395-0a4c-47be-8f64-382e60e3faad","Type":"ContainerStarted","Data":"152feee1511bbc3ac328d2740d0920b24a83751d1d4c0a4e97f0c8bed6a7d401"} Mar 20 17:24:11 crc kubenswrapper[4795]: I0320 17:24:11.600432 4795 generic.go:334] "Generic (PLEG): container finished" podID="35c14395-0a4c-47be-8f64-382e60e3faad" containerID="e208a8a62ce5332bce059cfe9498a63b10989e2ede473bf8237789de0f3da7f0" exitCode=0 Mar 20 17:24:11 crc kubenswrapper[4795]: I0320 17:24:11.600488 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" event={"ID":"35c14395-0a4c-47be-8f64-382e60e3faad","Type":"ContainerDied","Data":"e208a8a62ce5332bce059cfe9498a63b10989e2ede473bf8237789de0f3da7f0"} Mar 20 17:24:12 crc kubenswrapper[4795]: I0320 17:24:12.958630 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.032148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x59l4\" (UniqueName: \"kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4\") pod \"35c14395-0a4c-47be-8f64-382e60e3faad\" (UID: \"35c14395-0a4c-47be-8f64-382e60e3faad\") " Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.039803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4" (OuterVolumeSpecName: "kube-api-access-x59l4") pod "35c14395-0a4c-47be-8f64-382e60e3faad" (UID: "35c14395-0a4c-47be-8f64-382e60e3faad"). InnerVolumeSpecName "kube-api-access-x59l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.133568 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x59l4\" (UniqueName: \"kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4\") on node \"crc\" DevicePath \"\"" Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.618514 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" event={"ID":"35c14395-0a4c-47be-8f64-382e60e3faad","Type":"ContainerDied","Data":"152feee1511bbc3ac328d2740d0920b24a83751d1d4c0a4e97f0c8bed6a7d401"} Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.618956 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152feee1511bbc3ac328d2740d0920b24a83751d1d4c0a4e97f0c8bed6a7d401" Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.618588 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" Mar 20 17:24:41 crc kubenswrapper[4795]: I0320 17:24:41.300447 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:24:41 crc kubenswrapper[4795]: I0320 17:24:41.301822 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.230359 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.231063 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kzvch" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="registry-server" containerID="cri-o://c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1" gracePeriod=30 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.248465 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.248837 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kk5rk" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="registry-server" containerID="cri-o://a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233" gracePeriod=30 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.255915 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.256864 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" containerID="cri-o://b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f" gracePeriod=30 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.274499 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.274884 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ht4zv" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="registry-server" containerID="cri-o://1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d" gracePeriod=30 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.283596 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8x76m"] Mar 20 17:25:00 crc kubenswrapper[4795]: E0320 17:25:00.283966 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c14395-0a4c-47be-8f64-382e60e3faad" containerName="oc" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.283986 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c14395-0a4c-47be-8f64-382e60e3faad" containerName="oc" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.284189 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c14395-0a4c-47be-8f64-382e60e3faad" containerName="oc" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.284800 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.290516 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.290815 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q7czt" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="registry-server" containerID="cri-o://5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c" gracePeriod=30 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.298059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8x76m"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.379788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.379833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tntxg\" (UniqueName: \"kubernetes.io/projected/a2de2777-57e1-4310-a878-1cfc1fc77e44-kube-api-access-tntxg\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.379876 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.480767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.480821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tntxg\" (UniqueName: \"kubernetes.io/projected/a2de2777-57e1-4310-a878-1cfc1fc77e44-kube-api-access-tntxg\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.480868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.482697 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc 
kubenswrapper[4795]: I0320 17:25:00.493732 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.499232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tntxg\" (UniqueName: \"kubernetes.io/projected/a2de2777-57e1-4310-a878-1cfc1fc77e44-kube-api-access-tntxg\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.547968 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8jcpg"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.548549 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.578044 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8jcpg"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.600591 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.639286 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690838 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d9597a8-43b0-4f3e-adb9-5f0d32479431-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690912 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-tls\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-trusted-ca\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-bound-sa-token\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5c2p\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-kube-api-access-n5c2p\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.691023 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d9597a8-43b0-4f3e-adb9-5f0d32479431-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.691045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-certificates\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.695024 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.703258 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.751277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807181 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities\") pod \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities\") pod \"57849322-f280-42ee-a330-18120aeed5db\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807255 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cdbk\" (UniqueName: \"kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk\") pod \"ed1a790f-ddf0-4512-88c5-dba972460e8a\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807290 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2vjc\" (UniqueName: \"kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc\") pod \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 
17:25:00.807323 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content\") pod \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6zsd\" (UniqueName: \"kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd\") pod \"57849322-f280-42ee-a330-18120aeed5db\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics\") pod \"ed1a790f-ddf0-4512-88c5-dba972460e8a\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content\") pod \"57849322-f280-42ee-a330-18120aeed5db\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807511 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca\") pod \"ed1a790f-ddf0-4512-88c5-dba972460e8a\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808004 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5c2p\" (UniqueName: 
\"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-kube-api-access-n5c2p\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d9597a8-43b0-4f3e-adb9-5f0d32479431-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808077 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-certificates\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d9597a8-43b0-4f3e-adb9-5f0d32479431-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808145 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-tls\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808167 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-trusted-ca\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808194 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-bound-sa-token\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.809761 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities" (OuterVolumeSpecName: "utilities") pod "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" (UID: "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.810954 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.811269 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-certificates\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.811900 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ed1a790f-ddf0-4512-88c5-dba972460e8a" (UID: "ed1a790f-ddf0-4512-88c5-dba972460e8a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.812165 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-trusted-ca\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.814351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d9597a8-43b0-4f3e-adb9-5f0d32479431-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.814404 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities" (OuterVolumeSpecName: 
"utilities") pod "57849322-f280-42ee-a330-18120aeed5db" (UID: "57849322-f280-42ee-a330-18120aeed5db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.831989 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d9597a8-43b0-4f3e-adb9-5f0d32479431-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.834782 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk" (OuterVolumeSpecName: "kube-api-access-4cdbk") pod "ed1a790f-ddf0-4512-88c5-dba972460e8a" (UID: "ed1a790f-ddf0-4512-88c5-dba972460e8a"). InnerVolumeSpecName "kube-api-access-4cdbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.841573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ed1a790f-ddf0-4512-88c5-dba972460e8a" (UID: "ed1a790f-ddf0-4512-88c5-dba972460e8a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.859601 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc" (OuterVolumeSpecName: "kube-api-access-w2vjc") pod "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" (UID: "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f"). InnerVolumeSpecName "kube-api-access-w2vjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.859610 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-tls\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.859616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-bound-sa-token\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.859742 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd" (OuterVolumeSpecName: "kube-api-access-x6zsd") pod "57849322-f280-42ee-a330-18120aeed5db" (UID: "57849322-f280-42ee-a330-18120aeed5db"). InnerVolumeSpecName "kube-api-access-x6zsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.859975 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5c2p\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-kube-api-access-n5c2p\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.873394 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.896375 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57849322-f280-42ee-a330-18120aeed5db" (UID: "57849322-f280-42ee-a330-18120aeed5db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.904697 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" (UID: "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities\") pod \"70000016-e928-4b11-a31d-4d08e9450a1c\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgvm8\" (UniqueName: \"kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8\") pod \"70000016-e928-4b11-a31d-4d08e9450a1c\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908729 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content\") pod \"70000016-e928-4b11-a31d-4d08e9450a1c\" (UID: 
\"70000016-e928-4b11-a31d-4d08e9450a1c\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908906 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908919 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908927 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908936 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908944 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908952 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cdbk\" (UniqueName: \"kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908961 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2vjc\" (UniqueName: \"kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc 
kubenswrapper[4795]: I0320 17:25:00.908968 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908977 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6zsd\" (UniqueName: \"kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.909324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities" (OuterVolumeSpecName: "utilities") pod "70000016-e928-4b11-a31d-4d08e9450a1c" (UID: "70000016-e928-4b11-a31d-4d08e9450a1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.912180 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8" (OuterVolumeSpecName: "kube-api-access-sgvm8") pod "70000016-e928-4b11-a31d-4d08e9450a1c" (UID: "70000016-e928-4b11-a31d-4d08e9450a1c"). InnerVolumeSpecName "kube-api-access-sgvm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926161 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926520 4795 generic.go:334] "Generic (PLEG): container finished" podID="57849322-f280-42ee-a330-18120aeed5db" containerID="a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233" exitCode=0 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerDied","Data":"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926594 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926609 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerDied","Data":"333adeb9b81abd47208fc6ec71e454bad1f18be9356efa101b49dd2d5983cc19"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926630 4795 scope.go:117] "RemoveContainer" containerID="a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.931494 4795 generic.go:334] "Generic (PLEG): container finished" podID="70000016-e928-4b11-a31d-4d08e9450a1c" containerID="1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d" exitCode=0 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.931548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerDied","Data":"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 
17:25:00.931577 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerDied","Data":"37c22f0e8db69278ef99884d66f0d1b39626955adbad846c6823797b6df30257"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.931625 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.932441 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70000016-e928-4b11-a31d-4d08e9450a1c" (UID: "70000016-e928-4b11-a31d-4d08e9450a1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.933804 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerID="b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f" exitCode=0 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.933869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerDied","Data":"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.933898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerDied","Data":"10ac9aefe8ac1466c7fac8993e74ddbafb9c6821332b48f3d05657ff9290f6e5"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.933945 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.944190 4795 generic.go:334] "Generic (PLEG): container finished" podID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerID="c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1" exitCode=0 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.944249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerDied","Data":"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.944271 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerDied","Data":"352f21e959b8a9617f62fdaa474337c620b65ea35de203e2a6258d4f6ab66557"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.944281 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.946491 4795 generic.go:334] "Generic (PLEG): container finished" podID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerID="5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c" exitCode=0 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.946530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerDied","Data":"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.946559 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerDied","Data":"d21b253fa758e914360b02dd8aa7261d5b383defcc69cfc8b102952a167fd840"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.946615 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.946809 4795 scope.go:117] "RemoveContainer" containerID="36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.961257 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.966958 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.970434 4795 scope.go:117] "RemoveContainer" containerID="cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.984972 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.991878 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.995030 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.001876 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.008461 4795 scope.go:117] "RemoveContainer" containerID="a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.008979 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233\": container with ID starting with 
a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233 not found: ID does not exist" containerID="a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009006 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233"} err="failed to get container status \"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233\": rpc error: code = NotFound desc = could not find container \"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233\": container with ID starting with a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009027 4795 scope.go:117] "RemoveContainer" containerID="36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.009374 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f\": container with ID starting with 36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f not found: ID does not exist" containerID="36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009393 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f"} err="failed to get container status \"36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f\": rpc error: code = NotFound desc = could not find container \"36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f\": container with ID starting with 36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f not found: ID does not 
exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009405 4795 scope.go:117] "RemoveContainer" containerID="cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009558 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhxfb\" (UniqueName: \"kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb\") pod \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content\") pod \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.009671 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1\": container with ID starting with cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1 not found: ID does not exist" containerID="cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009880 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1"} err="failed to get container status \"cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1\": rpc error: code = NotFound desc = could not find container \"cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1\": container with ID starting with cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1 not found: ID does not exist" Mar 20 17:25:01 crc 
kubenswrapper[4795]: I0320 17:25:01.009925 4795 scope.go:117] "RemoveContainer" containerID="1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities\") pod \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.010459 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.010480 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgvm8\" (UniqueName: \"kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.010508 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.010765 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities" (OuterVolumeSpecName: "utilities") pod "73dd05f7-2cc4-4a99-b12d-26e4d436acca" (UID: "73dd05f7-2cc4-4a99-b12d-26e4d436acca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.012829 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb" (OuterVolumeSpecName: "kube-api-access-rhxfb") pod "73dd05f7-2cc4-4a99-b12d-26e4d436acca" (UID: "73dd05f7-2cc4-4a99-b12d-26e4d436acca"). InnerVolumeSpecName "kube-api-access-rhxfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.024381 4795 scope.go:117] "RemoveContainer" containerID="6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.035721 4795 scope.go:117] "RemoveContainer" containerID="9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.063558 4795 scope.go:117] "RemoveContainer" containerID="1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.065598 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d\": container with ID starting with 1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d not found: ID does not exist" containerID="1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.065633 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d"} err="failed to get container status \"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d\": rpc error: code = NotFound desc = could not find container \"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d\": container 
with ID starting with 1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.065654 4795 scope.go:117] "RemoveContainer" containerID="6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.066036 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e\": container with ID starting with 6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e not found: ID does not exist" containerID="6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.066051 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e"} err="failed to get container status \"6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e\": rpc error: code = NotFound desc = could not find container \"6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e\": container with ID starting with 6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.066064 4795 scope.go:117] "RemoveContainer" containerID="9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.066421 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25\": container with ID starting with 9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25 not found: ID does not exist" containerID="9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25" 
Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.066484 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25"} err="failed to get container status \"9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25\": rpc error: code = NotFound desc = could not find container \"9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25\": container with ID starting with 9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.066519 4795 scope.go:117] "RemoveContainer" containerID="b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.083368 4795 scope.go:117] "RemoveContainer" containerID="c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.098852 4795 scope.go:117] "RemoveContainer" containerID="b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.099236 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f\": container with ID starting with b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f not found: ID does not exist" containerID="b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.099267 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f"} err="failed to get container status \"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f\": rpc error: code = NotFound desc = could not find container 
\"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f\": container with ID starting with b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.099288 4795 scope.go:117] "RemoveContainer" containerID="c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.099581 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe\": container with ID starting with c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe not found: ID does not exist" containerID="c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.099626 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe"} err="failed to get container status \"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe\": rpc error: code = NotFound desc = could not find container \"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe\": container with ID starting with c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.099659 4795 scope.go:117] "RemoveContainer" containerID="c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.112575 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.112600 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rhxfb\" (UniqueName: \"kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.123822 4795 scope.go:117] "RemoveContainer" containerID="824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.161824 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8x76m"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.162847 4795 scope.go:117] "RemoveContainer" containerID="63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031" Mar 20 17:25:01 crc kubenswrapper[4795]: W0320 17:25:01.174274 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2de2777_57e1_4310_a878_1cfc1fc77e44.slice/crio-f24ef2dd48b7431034546306162844649d0ed3d2e1c74ff319399c5d956ac22e WatchSource:0}: Error finding container f24ef2dd48b7431034546306162844649d0ed3d2e1c74ff319399c5d956ac22e: Status 404 returned error can't find the container with id f24ef2dd48b7431034546306162844649d0ed3d2e1c74ff319399c5d956ac22e Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.180942 4795 scope.go:117] "RemoveContainer" containerID="c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.182616 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1\": container with ID starting with c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1 not found: ID does not exist" containerID="c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.182653 4795 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1"} err="failed to get container status \"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1\": rpc error: code = NotFound desc = could not find container \"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1\": container with ID starting with c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.182703 4795 scope.go:117] "RemoveContainer" containerID="824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.183497 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741\": container with ID starting with 824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741 not found: ID does not exist" containerID="824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.183546 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741"} err="failed to get container status \"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741\": rpc error: code = NotFound desc = could not find container \"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741\": container with ID starting with 824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.183581 4795 scope.go:117] "RemoveContainer" containerID="63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.183968 4795 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031\": container with ID starting with 63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031 not found: ID does not exist" containerID="63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.183992 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031"} err="failed to get container status \"63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031\": rpc error: code = NotFound desc = could not find container \"63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031\": container with ID starting with 63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.184007 4795 scope.go:117] "RemoveContainer" containerID="5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.185656 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8jcpg"] Mar 20 17:25:01 crc kubenswrapper[4795]: W0320 17:25:01.188831 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d9597a8_43b0_4f3e_adb9_5f0d32479431.slice/crio-5be6a5ac2fd99a192bb8e8787f6170afcbaf23bcc8e9fdb4f2ffddbf94d3514c WatchSource:0}: Error finding container 5be6a5ac2fd99a192bb8e8787f6170afcbaf23bcc8e9fdb4f2ffddbf94d3514c: Status 404 returned error can't find the container with id 5be6a5ac2fd99a192bb8e8787f6170afcbaf23bcc8e9fdb4f2ffddbf94d3514c Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.193394 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73dd05f7-2cc4-4a99-b12d-26e4d436acca" (UID: "73dd05f7-2cc4-4a99-b12d-26e4d436acca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.195945 4795 scope.go:117] "RemoveContainer" containerID="6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.211953 4795 scope.go:117] "RemoveContainer" containerID="6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.213280 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.227281 4795 scope.go:117] "RemoveContainer" containerID="5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.227952 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c\": container with ID starting with 5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c not found: ID does not exist" containerID="5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.228004 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c"} err="failed to get container status \"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c\": rpc error: code = NotFound desc = could not find container 
\"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c\": container with ID starting with 5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.228037 4795 scope.go:117] "RemoveContainer" containerID="6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.228392 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa\": container with ID starting with 6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa not found: ID does not exist" containerID="6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.228432 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa"} err="failed to get container status \"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa\": rpc error: code = NotFound desc = could not find container \"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa\": container with ID starting with 6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.228459 4795 scope.go:117] "RemoveContainer" containerID="6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.228797 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394\": container with ID starting with 6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394 not found: ID does not exist" 
containerID="6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.228817 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394"} err="failed to get container status \"6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394\": rpc error: code = NotFound desc = could not find container \"6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394\": container with ID starting with 6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.269730 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57849322-f280-42ee-a330-18120aeed5db" path="/var/lib/kubelet/pods/57849322-f280-42ee-a330-18120aeed5db/volumes" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.272319 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" path="/var/lib/kubelet/pods/ed1a790f-ddf0-4512-88c5-dba972460e8a/volumes" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.274485 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" path="/var/lib/kubelet/pods/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f/volumes" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.282425 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.282460 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.282478 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.285255 
4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.956994 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" event={"ID":"a2de2777-57e1-4310-a878-1cfc1fc77e44","Type":"ContainerStarted","Data":"4982d80de10a2ffa83b3a05d649ed0a52a93c8fee37931a38e37520f7e0db035"} Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.957248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" event={"ID":"a2de2777-57e1-4310-a878-1cfc1fc77e44","Type":"ContainerStarted","Data":"f24ef2dd48b7431034546306162844649d0ed3d2e1c74ff319399c5d956ac22e"} Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.958183 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.959820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" event={"ID":"4d9597a8-43b0-4f3e-adb9-5f0d32479431","Type":"ContainerStarted","Data":"a06ed4cb968c32d980ff6898fdf2fbbb3c2fdae55a9d73f63a4279b52e120cd0"} Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.960058 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.960210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" event={"ID":"4d9597a8-43b0-4f3e-adb9-5f0d32479431","Type":"ContainerStarted","Data":"5be6a5ac2fd99a192bb8e8787f6170afcbaf23bcc8e9fdb4f2ffddbf94d3514c"} Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.964345 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.000707 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" podStartSLOduration=2.000654018 podStartE2EDuration="2.000654018s" podCreationTimestamp="2026-03-20 17:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:25:02.000112811 +0000 UTC m=+445.458144422" watchObservedRunningTime="2026-03-20 17:25:02.000654018 +0000 UTC m=+445.458685599" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.005090 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" podStartSLOduration=2.00507188 podStartE2EDuration="2.00507188s" podCreationTimestamp="2026-03-20 17:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:25:01.977504057 +0000 UTC m=+445.435535618" watchObservedRunningTime="2026-03-20 17:25:02.00507188 +0000 UTC m=+445.463103461" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.448671 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nwm6j"] Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449018 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449033 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449045 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" 
containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449052 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449060 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449067 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449075 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449083 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449095 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449102 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449113 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449120 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449130 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" 
containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449137 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449147 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449154 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449167 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449176 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449184 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449190 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449200 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449208 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449224 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449234 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449241 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449347 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449361 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449374 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449384 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449396 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449409 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449519 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449529 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.450248 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.453033 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.462854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwm6j"] Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.533323 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7r4h\" (UniqueName: \"kubernetes.io/projected/38a67438-04e3-433b-9b32-47acf98b3086-kube-api-access-j7r4h\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.533445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-utilities\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.533493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-catalog-content\") pod 
\"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.634634 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7r4h\" (UniqueName: \"kubernetes.io/projected/38a67438-04e3-433b-9b32-47acf98b3086-kube-api-access-j7r4h\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.634856 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-utilities\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.634982 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-catalog-content\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.635413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-catalog-content\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.635750 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-utilities\") pod \"redhat-marketplace-nwm6j\" (UID: 
\"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.656655 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-94mw5"] Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.673595 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.674134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7r4h\" (UniqueName: \"kubernetes.io/projected/38a67438-04e3-433b-9b32-47acf98b3086-kube-api-access-j7r4h\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.676920 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.683056 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94mw5"] Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.736648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.736748 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " 
pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.736797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th9xz\" (UniqueName: \"kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.777503 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.837835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.837958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.838023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th9xz\" (UniqueName: \"kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.838221 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.838519 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.858455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th9xz\" (UniqueName: \"kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.012114 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwm6j"] Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.029121 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.258738 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" path="/var/lib/kubelet/pods/70000016-e928-4b11-a31d-4d08e9450a1c/volumes" Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.259842 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" path="/var/lib/kubelet/pods/73dd05f7-2cc4-4a99-b12d-26e4d436acca/volumes" Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.260443 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94mw5"] Mar 20 17:25:03 crc kubenswrapper[4795]: W0320 17:25:03.280114 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79f11dc_5b5e_4929_9a6f_281ade73c24a.slice/crio-5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd WatchSource:0}: Error finding container 5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd: Status 404 returned error can't find the container with id 5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.971890 4795 generic.go:334] "Generic (PLEG): container finished" podID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerID="ec6a69189563a780b942ae970e8e1801846953cabcf1239c190354a1203053b4" exitCode=0 Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.972002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mw5" event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerDied","Data":"ec6a69189563a780b942ae970e8e1801846953cabcf1239c190354a1203053b4"} Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.972239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-94mw5" event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerStarted","Data":"5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd"} Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.973654 4795 generic.go:334] "Generic (PLEG): container finished" podID="38a67438-04e3-433b-9b32-47acf98b3086" containerID="2b59f88f4a5d6912f063edde8df8197451820dd6052a0a1a69bab7cad387c6f9" exitCode=0 Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.974389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwm6j" event={"ID":"38a67438-04e3-433b-9b32-47acf98b3086","Type":"ContainerDied","Data":"2b59f88f4a5d6912f063edde8df8197451820dd6052a0a1a69bab7cad387c6f9"} Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.974430 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwm6j" event={"ID":"38a67438-04e3-433b-9b32-47acf98b3086","Type":"ContainerStarted","Data":"774dbfb3b638328353f7b91c57600c5171af1dd49c62166305a2e0b942f5bd70"} Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.847518 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n22t9"] Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.848795 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.852473 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.867389 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n22t9"] Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.979392 4795 generic.go:334] "Generic (PLEG): container finished" podID="38a67438-04e3-433b-9b32-47acf98b3086" containerID="8a6a21b4b6e94c1716bbf31482dd407fbaadbaad29cd8b437718a3eed01d163b" exitCode=0 Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.979440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwm6j" event={"ID":"38a67438-04e3-433b-9b32-47acf98b3086","Type":"ContainerDied","Data":"8a6a21b4b6e94c1716bbf31482dd407fbaadbaad29cd8b437718a3eed01d163b"} Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.981812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jkr4\" (UniqueName: \"kubernetes.io/projected/fb801735-41d3-4c6e-b9e7-083ad510100a-kube-api-access-2jkr4\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.981952 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-catalog-content\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.982026 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-utilities\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.053551 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tw8kt"] Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.054740 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.074886 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.088430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-catalog-content\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.088537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-utilities\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.088640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jkr4\" (UniqueName: \"kubernetes.io/projected/fb801735-41d3-4c6e-b9e7-083ad510100a-kube-api-access-2jkr4\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc 
kubenswrapper[4795]: I0320 17:25:05.089649 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-utilities\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.089872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-catalog-content\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.092857 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tw8kt"] Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.108478 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jkr4\" (UniqueName: \"kubernetes.io/projected/fb801735-41d3-4c6e-b9e7-083ad510100a-kube-api-access-2jkr4\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.181419 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.190073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8j2j\" (UniqueName: \"kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.190132 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.190168 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.291136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.291420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8j2j\" (UniqueName: \"kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j\") pod 
\"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.291453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.291848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.291885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.321712 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8j2j\" (UniqueName: \"kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.369012 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n22t9"] Mar 20 17:25:05 crc kubenswrapper[4795]: W0320 17:25:05.378818 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb801735_41d3_4c6e_b9e7_083ad510100a.slice/crio-10b2d86feb55feb90fd004975acafbf78736cfa66cdbf35ae03cee5ef43d6767 WatchSource:0}: Error finding container 10b2d86feb55feb90fd004975acafbf78736cfa66cdbf35ae03cee5ef43d6767: Status 404 returned error can't find the container with id 10b2d86feb55feb90fd004975acafbf78736cfa66cdbf35ae03cee5ef43d6767 Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.400085 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.569158 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tw8kt"] Mar 20 17:25:05 crc kubenswrapper[4795]: W0320 17:25:05.585408 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cba71d7_62e8_4541_9728_23dd5ff4b982.slice/crio-dc6b8994280ee1e1b27cf8f2c886374ff8bd967d27295a99223b5cd05c51c5e5 WatchSource:0}: Error finding container dc6b8994280ee1e1b27cf8f2c886374ff8bd967d27295a99223b5cd05c51c5e5: Status 404 returned error can't find the container with id dc6b8994280ee1e1b27cf8f2c886374ff8bd967d27295a99223b5cd05c51c5e5 Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.986569 4795 generic.go:334] "Generic (PLEG): container finished" podID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerID="099eb6fe1b44619943ee789acf319c90001ea00f649ef59a36a0aa98e76bd549" exitCode=0 Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.986646 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerDied","Data":"099eb6fe1b44619943ee789acf319c90001ea00f649ef59a36a0aa98e76bd549"} Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.986676 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerStarted","Data":"dc6b8994280ee1e1b27cf8f2c886374ff8bd967d27295a99223b5cd05c51c5e5"} Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.988063 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb801735-41d3-4c6e-b9e7-083ad510100a" containerID="f2d66cb7edd3c4882b542108d443235dd96024cbaf91d61c20130ec249f2d423" exitCode=0 Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.988103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n22t9" event={"ID":"fb801735-41d3-4c6e-b9e7-083ad510100a","Type":"ContainerDied","Data":"f2d66cb7edd3c4882b542108d443235dd96024cbaf91d61c20130ec249f2d423"} Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.988128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n22t9" event={"ID":"fb801735-41d3-4c6e-b9e7-083ad510100a","Type":"ContainerStarted","Data":"10b2d86feb55feb90fd004975acafbf78736cfa66cdbf35ae03cee5ef43d6767"} Mar 20 17:25:07 crc kubenswrapper[4795]: I0320 17:25:06.998170 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwm6j" event={"ID":"38a67438-04e3-433b-9b32-47acf98b3086","Type":"ContainerStarted","Data":"502dc732bec40dfb721185d1e3ed96c41b58a81919acd65a8913bb1006af66d6"} Mar 20 17:25:07 crc kubenswrapper[4795]: I0320 17:25:07.001320 4795 generic.go:334] "Generic (PLEG): container finished" podID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerID="88b194a74064309f622b8e25f76f210948d20e5936b41beb91453d2773fb7483" exitCode=0 Mar 20 17:25:07 crc kubenswrapper[4795]: I0320 17:25:07.001345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mw5" 
event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerDied","Data":"88b194a74064309f622b8e25f76f210948d20e5936b41beb91453d2773fb7483"} Mar 20 17:25:07 crc kubenswrapper[4795]: I0320 17:25:07.019222 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nwm6j" podStartSLOduration=3.107960716 podStartE2EDuration="5.01919827s" podCreationTimestamp="2026-03-20 17:25:02 +0000 UTC" firstStartedPulling="2026-03-20 17:25:03.975544703 +0000 UTC m=+447.433576234" lastFinishedPulling="2026-03-20 17:25:05.886782217 +0000 UTC m=+449.344813788" observedRunningTime="2026-03-20 17:25:07.017169254 +0000 UTC m=+450.475200815" watchObservedRunningTime="2026-03-20 17:25:07.01919827 +0000 UTC m=+450.477229811" Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.009493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mw5" event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerStarted","Data":"4e8b0e1259002a3662a0cddeb6fd2fae0a2ae00aa800f638b73b2262d55bedd2"} Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.012242 4795 generic.go:334] "Generic (PLEG): container finished" podID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerID="8963fe7721a09d9f6c228e790432497f6b1fff70d60afc4485e7fcd92391890f" exitCode=0 Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.012443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerDied","Data":"8963fe7721a09d9f6c228e790432497f6b1fff70d60afc4485e7fcd92391890f"} Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.015289 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb801735-41d3-4c6e-b9e7-083ad510100a" containerID="b4ccc0976c4f09d21e82657e8bb7bfc336959e539c86beb3d45bb5f675188895" exitCode=0 Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.016300 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n22t9" event={"ID":"fb801735-41d3-4c6e-b9e7-083ad510100a","Type":"ContainerDied","Data":"b4ccc0976c4f09d21e82657e8bb7bfc336959e539c86beb3d45bb5f675188895"} Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.037717 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-94mw5" podStartSLOduration=2.20988869 podStartE2EDuration="6.037667991s" podCreationTimestamp="2026-03-20 17:25:02 +0000 UTC" firstStartedPulling="2026-03-20 17:25:03.974417246 +0000 UTC m=+447.432448787" lastFinishedPulling="2026-03-20 17:25:07.802196517 +0000 UTC m=+451.260228088" observedRunningTime="2026-03-20 17:25:08.034283432 +0000 UTC m=+451.492314983" watchObservedRunningTime="2026-03-20 17:25:08.037667991 +0000 UTC m=+451.495699542" Mar 20 17:25:09 crc kubenswrapper[4795]: I0320 17:25:09.023580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerStarted","Data":"e51f3706cb85710070577b764255e613eb4bda5a66f7cf44046e8dca83ade02b"} Mar 20 17:25:09 crc kubenswrapper[4795]: I0320 17:25:09.042543 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tw8kt" podStartSLOduration=1.424651882 podStartE2EDuration="4.042525416s" podCreationTimestamp="2026-03-20 17:25:05 +0000 UTC" firstStartedPulling="2026-03-20 17:25:06.031289228 +0000 UTC m=+449.489320769" lastFinishedPulling="2026-03-20 17:25:08.649162752 +0000 UTC m=+452.107194303" observedRunningTime="2026-03-20 17:25:09.038115045 +0000 UTC m=+452.496146606" watchObservedRunningTime="2026-03-20 17:25:09.042525416 +0000 UTC m=+452.500556967" Mar 20 17:25:10 crc kubenswrapper[4795]: I0320 17:25:10.031987 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-n22t9" event={"ID":"fb801735-41d3-4c6e-b9e7-083ad510100a","Type":"ContainerStarted","Data":"07f9f94328a1c188e056e6fb632a62a1885a43085acf040b09ef3b2300259b0f"} Mar 20 17:25:10 crc kubenswrapper[4795]: I0320 17:25:10.056987 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n22t9" podStartSLOduration=3.202547156 podStartE2EDuration="6.056969079s" podCreationTimestamp="2026-03-20 17:25:04 +0000 UTC" firstStartedPulling="2026-03-20 17:25:06.031371681 +0000 UTC m=+449.489403262" lastFinishedPulling="2026-03-20 17:25:08.885793634 +0000 UTC m=+452.343825185" observedRunningTime="2026-03-20 17:25:10.055520982 +0000 UTC m=+453.513552533" watchObservedRunningTime="2026-03-20 17:25:10.056969079 +0000 UTC m=+453.515000620" Mar 20 17:25:11 crc kubenswrapper[4795]: I0320 17:25:11.300464 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:25:11 crc kubenswrapper[4795]: I0320 17:25:11.300518 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:25:12 crc kubenswrapper[4795]: I0320 17:25:12.778357 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:12 crc kubenswrapper[4795]: I0320 17:25:12.778931 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:12 crc kubenswrapper[4795]: 
I0320 17:25:12.843135 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:13 crc kubenswrapper[4795]: I0320 17:25:13.030256 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:13 crc kubenswrapper[4795]: I0320 17:25:13.030315 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:13 crc kubenswrapper[4795]: I0320 17:25:13.087532 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:13 crc kubenswrapper[4795]: I0320 17:25:13.127577 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:13 crc kubenswrapper[4795]: I0320 17:25:13.149407 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:15 crc kubenswrapper[4795]: I0320 17:25:15.182000 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:15 crc kubenswrapper[4795]: I0320 17:25:15.182078 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:15 crc kubenswrapper[4795]: I0320 17:25:15.401070 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:15 crc kubenswrapper[4795]: I0320 17:25:15.401172 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:15 crc kubenswrapper[4795]: I0320 17:25:15.466875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:16 crc kubenswrapper[4795]: I0320 17:25:16.108658 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:16 crc kubenswrapper[4795]: I0320 17:25:16.239412 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n22t9" podUID="fb801735-41d3-4c6e-b9e7-083ad510100a" containerName="registry-server" probeResult="failure" output=< Mar 20 17:25:16 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:25:16 crc kubenswrapper[4795]: > Mar 20 17:25:20 crc kubenswrapper[4795]: I0320 17:25:20.932332 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:21 crc kubenswrapper[4795]: I0320 17:25:21.028671 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"] Mar 20 17:25:25 crc kubenswrapper[4795]: I0320 17:25:25.263818 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:25 crc kubenswrapper[4795]: I0320 17:25:25.334435 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:41 crc kubenswrapper[4795]: I0320 17:25:41.300422 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:25:41 crc kubenswrapper[4795]: I0320 17:25:41.301059 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:25:41 crc kubenswrapper[4795]: I0320 17:25:41.301143 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:25:41 crc kubenswrapper[4795]: I0320 17:25:41.302120 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:25:41 crc kubenswrapper[4795]: I0320 17:25:41.302210 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b" gracePeriod=600 Mar 20 17:25:42 crc kubenswrapper[4795]: I0320 17:25:42.419785 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b" exitCode=0 Mar 20 17:25:42 crc kubenswrapper[4795]: I0320 17:25:42.419914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b"} Mar 20 17:25:42 crc kubenswrapper[4795]: I0320 17:25:42.420244 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f"} Mar 20 17:25:42 crc kubenswrapper[4795]: I0320 17:25:42.420265 4795 scope.go:117] "RemoveContainer" containerID="6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.069293 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" podUID="3dde633a-aefe-4c9b-84a7-301279016583" containerName="registry" containerID="cri-o://76b0d688b149e45910b8799bbb4e20410e0480f70929e235a28f86178319123a" gracePeriod=30 Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.449578 4795 generic.go:334] "Generic (PLEG): container finished" podID="3dde633a-aefe-4c9b-84a7-301279016583" containerID="76b0d688b149e45910b8799bbb4e20410e0480f70929e235a28f86178319123a" exitCode=0 Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.449726 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" event={"ID":"3dde633a-aefe-4c9b-84a7-301279016583","Type":"ContainerDied","Data":"76b0d688b149e45910b8799bbb4e20410e0480f70929e235a28f86178319123a"} Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.449934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" event={"ID":"3dde633a-aefe-4c9b-84a7-301279016583","Type":"ContainerDied","Data":"3db277134197ff5142f4f0d85c126502b98d3bb29670b6c4409e582bcdf40d86"} Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.449952 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3db277134197ff5142f4f0d85c126502b98d3bb29670b6c4409e582bcdf40d86" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.452660 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.465836 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.465909 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.465959 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.466016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.466065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.466910 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.467181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.467313 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2bwk\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.467862 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.468030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.468360 4795 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.468392 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.473587 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.474519 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.476950 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.513384 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.514677 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk" (OuterVolumeSpecName: "kube-api-access-p2bwk") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "kube-api-access-p2bwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.518013 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.570014 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2bwk\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.570059 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.570080 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.570098 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.570115 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:47 crc kubenswrapper[4795]: I0320 17:25:47.456551 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:25:47 crc kubenswrapper[4795]: I0320 17:25:47.486549 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"] Mar 20 17:25:47 crc kubenswrapper[4795]: I0320 17:25:47.499813 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"] Mar 20 17:25:49 crc kubenswrapper[4795]: I0320 17:25:49.263025 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dde633a-aefe-4c9b-84a7-301279016583" path="/var/lib/kubelet/pods/3dde633a-aefe-4c9b-84a7-301279016583/volumes" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.146785 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567126-nhz8w"] Mar 20 17:26:00 crc kubenswrapper[4795]: E0320 17:26:00.148005 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dde633a-aefe-4c9b-84a7-301279016583" containerName="registry" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.148033 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dde633a-aefe-4c9b-84a7-301279016583" containerName="registry" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.148291 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dde633a-aefe-4c9b-84a7-301279016583" containerName="registry" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.149136 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.156516 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.157378 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.157868 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.158286 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-nhz8w"] Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.256862 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cn9h\" (UniqueName: \"kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h\") pod \"auto-csr-approver-29567126-nhz8w\" (UID: \"740c1ddf-96e5-46f6-837c-73372748464e\") " pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.358802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cn9h\" (UniqueName: \"kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h\") pod \"auto-csr-approver-29567126-nhz8w\" (UID: \"740c1ddf-96e5-46f6-837c-73372748464e\") " pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.393662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cn9h\" (UniqueName: \"kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h\") pod \"auto-csr-approver-29567126-nhz8w\" (UID: \"740c1ddf-96e5-46f6-837c-73372748464e\") " 
pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.482751 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.735914 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-nhz8w"] Mar 20 17:26:00 crc kubenswrapper[4795]: W0320 17:26:00.744453 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod740c1ddf_96e5_46f6_837c_73372748464e.slice/crio-127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6 WatchSource:0}: Error finding container 127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6: Status 404 returned error can't find the container with id 127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6 Mar 20 17:26:01 crc kubenswrapper[4795]: I0320 17:26:01.573993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" event={"ID":"740c1ddf-96e5-46f6-837c-73372748464e","Type":"ContainerStarted","Data":"127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6"} Mar 20 17:26:02 crc kubenswrapper[4795]: I0320 17:26:02.584071 4795 generic.go:334] "Generic (PLEG): container finished" podID="740c1ddf-96e5-46f6-837c-73372748464e" containerID="ea095688dd8877661afbf85ce172a04981e2524e4cbc5e45ea0fa637fadfbc39" exitCode=0 Mar 20 17:26:02 crc kubenswrapper[4795]: I0320 17:26:02.584211 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" event={"ID":"740c1ddf-96e5-46f6-837c-73372748464e","Type":"ContainerDied","Data":"ea095688dd8877661afbf85ce172a04981e2524e4cbc5e45ea0fa637fadfbc39"} Mar 20 17:26:03 crc kubenswrapper[4795]: I0320 17:26:03.834470 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.006668 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cn9h\" (UniqueName: \"kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h\") pod \"740c1ddf-96e5-46f6-837c-73372748464e\" (UID: \"740c1ddf-96e5-46f6-837c-73372748464e\") " Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.015873 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h" (OuterVolumeSpecName: "kube-api-access-8cn9h") pod "740c1ddf-96e5-46f6-837c-73372748464e" (UID: "740c1ddf-96e5-46f6-837c-73372748464e"). InnerVolumeSpecName "kube-api-access-8cn9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.108166 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cn9h\" (UniqueName: \"kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h\") on node \"crc\" DevicePath \"\"" Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.604304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" event={"ID":"740c1ddf-96e5-46f6-837c-73372748464e","Type":"ContainerDied","Data":"127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6"} Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.604362 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6" Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.604396 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.915514 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-j7789"] Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.920566 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-j7789"] Mar 20 17:26:05 crc kubenswrapper[4795]: I0320 17:26:05.260512 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed1d31b-b060-45c3-95bf-3b226a36efe1" path="/var/lib/kubelet/pods/bed1d31b-b060-45c3-95bf-3b226a36efe1/volumes" Mar 20 17:27:49 crc kubenswrapper[4795]: I0320 17:27:49.085854 4795 scope.go:117] "RemoveContainer" containerID="76aa98549ce46db60ce0a3b7fd4c6b9ed28e4c1b7375fc84abcdb33fcf4ef287" Mar 20 17:27:49 crc kubenswrapper[4795]: I0320 17:27:49.135497 4795 scope.go:117] "RemoveContainer" containerID="76b0d688b149e45910b8799bbb4e20410e0480f70929e235a28f86178319123a" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.149637 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567128-bqp8h"] Mar 20 17:28:00 crc kubenswrapper[4795]: E0320 17:28:00.150324 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740c1ddf-96e5-46f6-837c-73372748464e" containerName="oc" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.150345 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="740c1ddf-96e5-46f6-837c-73372748464e" containerName="oc" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.150525 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="740c1ddf-96e5-46f6-837c-73372748464e" containerName="oc" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.151151 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.154759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.154826 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.161298 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.165145 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-bqp8h"] Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.165256 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhp4\" (UniqueName: \"kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4\") pod \"auto-csr-approver-29567128-bqp8h\" (UID: \"4be9f091-42a0-432b-8f14-700bc3e733cb\") " pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.267046 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhp4\" (UniqueName: \"kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4\") pod \"auto-csr-approver-29567128-bqp8h\" (UID: \"4be9f091-42a0-432b-8f14-700bc3e733cb\") " pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.301642 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lhp4\" (UniqueName: \"kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4\") pod \"auto-csr-approver-29567128-bqp8h\" (UID: \"4be9f091-42a0-432b-8f14-700bc3e733cb\") " 
pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.479447 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.980771 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-bqp8h"] Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.989877 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:28:01 crc kubenswrapper[4795]: I0320 17:28:01.438326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" event={"ID":"4be9f091-42a0-432b-8f14-700bc3e733cb","Type":"ContainerStarted","Data":"cfd52f60c14ca3ef1e9f12424bfa6942e34a20ac9b57389e7f65a806f4083442"} Mar 20 17:28:03 crc kubenswrapper[4795]: I0320 17:28:03.469468 4795 generic.go:334] "Generic (PLEG): container finished" podID="4be9f091-42a0-432b-8f14-700bc3e733cb" containerID="326b8c75bc495d3f796856aa4f0f247f31974ad88ddb26ad9ca2ca9ec8cf372a" exitCode=0 Mar 20 17:28:03 crc kubenswrapper[4795]: I0320 17:28:03.469536 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" event={"ID":"4be9f091-42a0-432b-8f14-700bc3e733cb","Type":"ContainerDied","Data":"326b8c75bc495d3f796856aa4f0f247f31974ad88ddb26ad9ca2ca9ec8cf372a"} Mar 20 17:28:04 crc kubenswrapper[4795]: I0320 17:28:04.809474 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:04 crc kubenswrapper[4795]: I0320 17:28:04.934270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lhp4\" (UniqueName: \"kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4\") pod \"4be9f091-42a0-432b-8f14-700bc3e733cb\" (UID: \"4be9f091-42a0-432b-8f14-700bc3e733cb\") " Mar 20 17:28:04 crc kubenswrapper[4795]: I0320 17:28:04.940727 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4" (OuterVolumeSpecName: "kube-api-access-7lhp4") pod "4be9f091-42a0-432b-8f14-700bc3e733cb" (UID: "4be9f091-42a0-432b-8f14-700bc3e733cb"). InnerVolumeSpecName "kube-api-access-7lhp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.036838 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lhp4\" (UniqueName: \"kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4\") on node \"crc\" DevicePath \"\"" Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.485609 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" event={"ID":"4be9f091-42a0-432b-8f14-700bc3e733cb","Type":"ContainerDied","Data":"cfd52f60c14ca3ef1e9f12424bfa6942e34a20ac9b57389e7f65a806f4083442"} Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.485671 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfd52f60c14ca3ef1e9f12424bfa6942e34a20ac9b57389e7f65a806f4083442" Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.485791 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.889755 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-fns4l"] Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.893739 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-fns4l"] Mar 20 17:28:07 crc kubenswrapper[4795]: I0320 17:28:07.263133 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0486c12-c384-46ff-925b-bfeefb1d59bb" path="/var/lib/kubelet/pods/a0486c12-c384-46ff-925b-bfeefb1d59bb/volumes" Mar 20 17:28:11 crc kubenswrapper[4795]: I0320 17:28:11.300203 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:28:11 crc kubenswrapper[4795]: I0320 17:28:11.300283 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:28:41 crc kubenswrapper[4795]: I0320 17:28:41.300800 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:28:41 crc kubenswrapper[4795]: I0320 17:28:41.303083 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:28:49 crc kubenswrapper[4795]: I0320 17:28:49.188941 4795 scope.go:117] "RemoveContainer" containerID="14e15a12796f646063cb5f653e99e6ad23f1724726dfb97b08e9621c085665c1" Mar 20 17:29:11 crc kubenswrapper[4795]: I0320 17:29:11.300494 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:29:11 crc kubenswrapper[4795]: I0320 17:29:11.302205 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:29:11 crc kubenswrapper[4795]: I0320 17:29:11.302368 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:29:11 crc kubenswrapper[4795]: I0320 17:29:11.303382 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:29:11 crc kubenswrapper[4795]: I0320 17:29:11.303491 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f" gracePeriod=600 Mar 20 17:29:12 crc kubenswrapper[4795]: I0320 17:29:12.260293 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f" exitCode=0 Mar 20 17:29:12 crc kubenswrapper[4795]: I0320 17:29:12.260534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f"} Mar 20 17:29:12 crc kubenswrapper[4795]: I0320 17:29:12.260920 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842"} Mar 20 17:29:12 crc kubenswrapper[4795]: I0320 17:29:12.260953 4795 scope.go:117] "RemoveContainer" containerID="c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.140718 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567130-kh5md"] Mar 20 17:30:00 crc kubenswrapper[4795]: E0320 17:30:00.141622 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be9f091-42a0-432b-8f14-700bc3e733cb" containerName="oc" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.141645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be9f091-42a0-432b-8f14-700bc3e733cb" containerName="oc" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.141875 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4be9f091-42a0-432b-8f14-700bc3e733cb" containerName="oc" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.142589 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.145910 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.146659 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn"] Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.147509 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.150789 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.151070 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.152027 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.152518 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.161756 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-kh5md"] Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.169120 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn"] Mar 20 17:30:00 crc 
kubenswrapper[4795]: I0320 17:30:00.323856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx5ss\" (UniqueName: \"kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss\") pod \"auto-csr-approver-29567130-kh5md\" (UID: \"f93986a1-82a8-4eac-ba5e-f790196b25ce\") " pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.324006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.324070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.324154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhbvb\" (UniqueName: \"kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.424867 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhbvb\" (UniqueName: 
\"kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.424971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx5ss\" (UniqueName: \"kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss\") pod \"auto-csr-approver-29567130-kh5md\" (UID: \"f93986a1-82a8-4eac-ba5e-f790196b25ce\") " pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.425033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.425065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.426752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.440172 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.445671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhbvb\" (UniqueName: \"kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.447766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx5ss\" (UniqueName: \"kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss\") pod \"auto-csr-approver-29567130-kh5md\" (UID: \"f93986a1-82a8-4eac-ba5e-f790196b25ce\") " pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.465403 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.477558 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.729774 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-kh5md"] Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.794966 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn"] Mar 20 17:30:00 crc kubenswrapper[4795]: W0320 17:30:00.801915 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6892589_ca9a_45cc_8991_ab0029e67e3c.slice/crio-f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0 WatchSource:0}: Error finding container f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0: Status 404 returned error can't find the container with id f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0 Mar 20 17:30:01 crc kubenswrapper[4795]: I0320 17:30:01.592999 4795 generic.go:334] "Generic (PLEG): container finished" podID="a6892589-ca9a-45cc-8991-ab0029e67e3c" containerID="c957ead85ece246848e605f8f78734d00ae750bd985db9f200ae787909bd1425" exitCode=0 Mar 20 17:30:01 crc kubenswrapper[4795]: I0320 17:30:01.593061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" event={"ID":"a6892589-ca9a-45cc-8991-ab0029e67e3c","Type":"ContainerDied","Data":"c957ead85ece246848e605f8f78734d00ae750bd985db9f200ae787909bd1425"} Mar 20 17:30:01 crc kubenswrapper[4795]: I0320 17:30:01.593491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" event={"ID":"a6892589-ca9a-45cc-8991-ab0029e67e3c","Type":"ContainerStarted","Data":"f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0"} Mar 20 17:30:01 crc kubenswrapper[4795]: I0320 
17:30:01.596940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567130-kh5md" event={"ID":"f93986a1-82a8-4eac-ba5e-f790196b25ce","Type":"ContainerStarted","Data":"f2c40797c31a64b650cf210f5c7ba69205ad6d31cc00e971cfce59abdfa1f4eb"} Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.604317 4795 generic.go:334] "Generic (PLEG): container finished" podID="f93986a1-82a8-4eac-ba5e-f790196b25ce" containerID="56b4e175842a208b79b6d416a354b0c057585a391bd973a1b6ce26b23a0cd738" exitCode=0 Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.604422 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567130-kh5md" event={"ID":"f93986a1-82a8-4eac-ba5e-f790196b25ce","Type":"ContainerDied","Data":"56b4e175842a208b79b6d416a354b0c057585a391bd973a1b6ce26b23a0cd738"} Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.811549 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.955607 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhbvb\" (UniqueName: \"kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb\") pod \"a6892589-ca9a-45cc-8991-ab0029e67e3c\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.956030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume\") pod \"a6892589-ca9a-45cc-8991-ab0029e67e3c\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.956141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume\") pod \"a6892589-ca9a-45cc-8991-ab0029e67e3c\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.956785 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume" (OuterVolumeSpecName: "config-volume") pod "a6892589-ca9a-45cc-8991-ab0029e67e3c" (UID: "a6892589-ca9a-45cc-8991-ab0029e67e3c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.961532 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb" (OuterVolumeSpecName: "kube-api-access-rhbvb") pod "a6892589-ca9a-45cc-8991-ab0029e67e3c" (UID: "a6892589-ca9a-45cc-8991-ab0029e67e3c"). InnerVolumeSpecName "kube-api-access-rhbvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.964944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a6892589-ca9a-45cc-8991-ab0029e67e3c" (UID: "a6892589-ca9a-45cc-8991-ab0029e67e3c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.057250 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.057302 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.057328 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhbvb\" (UniqueName: \"kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.615621 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.615711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" event={"ID":"a6892589-ca9a-45cc-8991-ab0029e67e3c","Type":"ContainerDied","Data":"f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0"} Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.615773 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.870978 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.070107 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx5ss\" (UniqueName: \"kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss\") pod \"f93986a1-82a8-4eac-ba5e-f790196b25ce\" (UID: \"f93986a1-82a8-4eac-ba5e-f790196b25ce\") " Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.075936 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss" (OuterVolumeSpecName: "kube-api-access-sx5ss") pod "f93986a1-82a8-4eac-ba5e-f790196b25ce" (UID: "f93986a1-82a8-4eac-ba5e-f790196b25ce"). InnerVolumeSpecName "kube-api-access-sx5ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.172174 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx5ss\" (UniqueName: \"kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.622984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567130-kh5md" event={"ID":"f93986a1-82a8-4eac-ba5e-f790196b25ce","Type":"ContainerDied","Data":"f2c40797c31a64b650cf210f5c7ba69205ad6d31cc00e971cfce59abdfa1f4eb"} Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.623043 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2c40797c31a64b650cf210f5c7ba69205ad6d31cc00e971cfce59abdfa1f4eb" Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.623070 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.950013 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-wjlwc"] Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.956953 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-wjlwc"] Mar 20 17:30:05 crc kubenswrapper[4795]: I0320 17:30:05.265526 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c14395-0a4c-47be-8f64-382e60e3faad" path="/var/lib/kubelet/pods/35c14395-0a4c-47be-8f64-382e60e3faad/volumes" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.593475 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-smr2n"] Mar 20 17:30:37 crc kubenswrapper[4795]: E0320 17:30:37.596206 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93986a1-82a8-4eac-ba5e-f790196b25ce" containerName="oc" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.596419 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93986a1-82a8-4eac-ba5e-f790196b25ce" containerName="oc" Mar 20 17:30:37 crc kubenswrapper[4795]: E0320 17:30:37.596850 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6892589-ca9a-45cc-8991-ab0029e67e3c" containerName="collect-profiles" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.596966 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6892589-ca9a-45cc-8991-ab0029e67e3c" containerName="collect-profiles" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.597322 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6892589-ca9a-45cc-8991-ab0029e67e3c" containerName="collect-profiles" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.597552 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93986a1-82a8-4eac-ba5e-f790196b25ce" 
containerName="oc" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.598583 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.601403 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.601705 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-f2z74" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.601829 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.614744 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-smr2n"] Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.624787 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-lqmsr"] Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.625448 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lqmsr" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.627129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-c75zp" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.634215 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cff8c"] Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.640078 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.645353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-c7fr7" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.674117 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cff8c"] Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.690534 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lqmsr"] Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.772992 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmfwf\" (UniqueName: \"kubernetes.io/projected/88832f68-9f72-4321-8d3f-bb3e23465fdb-kube-api-access-mmfwf\") pod \"cert-manager-webhook-687f57d79b-cff8c\" (UID: \"88832f68-9f72-4321-8d3f-bb3e23465fdb\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.773066 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwwtf\" (UniqueName: \"kubernetes.io/projected/7df834a3-0298-4cc9-8b4e-49ce3f51183e-kube-api-access-pwwtf\") pod \"cert-manager-cainjector-cf98fcc89-smr2n\" (UID: \"7df834a3-0298-4cc9-8b4e-49ce3f51183e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.773090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dzz\" (UniqueName: \"kubernetes.io/projected/5231a25a-8bda-4f72-8a81-e5a49cdc31eb-kube-api-access-58dzz\") pod \"cert-manager-858654f9db-lqmsr\" (UID: \"5231a25a-8bda-4f72-8a81-e5a49cdc31eb\") " pod="cert-manager/cert-manager-858654f9db-lqmsr" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.874332 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmfwf\" (UniqueName: \"kubernetes.io/projected/88832f68-9f72-4321-8d3f-bb3e23465fdb-kube-api-access-mmfwf\") pod \"cert-manager-webhook-687f57d79b-cff8c\" (UID: \"88832f68-9f72-4321-8d3f-bb3e23465fdb\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.874506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwwtf\" (UniqueName: \"kubernetes.io/projected/7df834a3-0298-4cc9-8b4e-49ce3f51183e-kube-api-access-pwwtf\") pod \"cert-manager-cainjector-cf98fcc89-smr2n\" (UID: \"7df834a3-0298-4cc9-8b4e-49ce3f51183e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.874549 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58dzz\" (UniqueName: \"kubernetes.io/projected/5231a25a-8bda-4f72-8a81-e5a49cdc31eb-kube-api-access-58dzz\") pod \"cert-manager-858654f9db-lqmsr\" (UID: \"5231a25a-8bda-4f72-8a81-e5a49cdc31eb\") " pod="cert-manager/cert-manager-858654f9db-lqmsr" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.895516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dzz\" (UniqueName: \"kubernetes.io/projected/5231a25a-8bda-4f72-8a81-e5a49cdc31eb-kube-api-access-58dzz\") pod \"cert-manager-858654f9db-lqmsr\" (UID: \"5231a25a-8bda-4f72-8a81-e5a49cdc31eb\") " pod="cert-manager/cert-manager-858654f9db-lqmsr" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.900707 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwwtf\" (UniqueName: \"kubernetes.io/projected/7df834a3-0298-4cc9-8b4e-49ce3f51183e-kube-api-access-pwwtf\") pod \"cert-manager-cainjector-cf98fcc89-smr2n\" (UID: \"7df834a3-0298-4cc9-8b4e-49ce3f51183e\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.911424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmfwf\" (UniqueName: \"kubernetes.io/projected/88832f68-9f72-4321-8d3f-bb3e23465fdb-kube-api-access-mmfwf\") pod \"cert-manager-webhook-687f57d79b-cff8c\" (UID: \"88832f68-9f72-4321-8d3f-bb3e23465fdb\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.923001 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.948509 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lqmsr" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.963143 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:38 crc kubenswrapper[4795]: I0320 17:30:38.137241 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-smr2n"] Mar 20 17:30:38 crc kubenswrapper[4795]: I0320 17:30:38.157179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" event={"ID":"7df834a3-0298-4cc9-8b4e-49ce3f51183e","Type":"ContainerStarted","Data":"efd5b0b29a9788b0c0d1841898dab5499cecdede2ffed8cf44a49ca57874eabf"} Mar 20 17:30:38 crc kubenswrapper[4795]: I0320 17:30:38.392223 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lqmsr"] Mar 20 17:30:38 crc kubenswrapper[4795]: W0320 17:30:38.395331 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5231a25a_8bda_4f72_8a81_e5a49cdc31eb.slice/crio-a4c2b9640381d9fa568d89513a5f9297f28795b992c4bdd4796e60290e5a3366 WatchSource:0}: Error finding container a4c2b9640381d9fa568d89513a5f9297f28795b992c4bdd4796e60290e5a3366: Status 404 returned error can't find the container with id a4c2b9640381d9fa568d89513a5f9297f28795b992c4bdd4796e60290e5a3366 Mar 20 17:30:38 crc kubenswrapper[4795]: I0320 17:30:38.395782 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cff8c"] Mar 20 17:30:38 crc kubenswrapper[4795]: W0320 17:30:38.398768 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88832f68_9f72_4321_8d3f_bb3e23465fdb.slice/crio-f1b1c3d62274195c3190df7984c1b8d6d7bed0164a55be030f7fa0f8dae143ff WatchSource:0}: Error finding container f1b1c3d62274195c3190df7984c1b8d6d7bed0164a55be030f7fa0f8dae143ff: Status 404 returned error can't find the container with id f1b1c3d62274195c3190df7984c1b8d6d7bed0164a55be030f7fa0f8dae143ff Mar 20 17:30:39 crc kubenswrapper[4795]: I0320 17:30:39.164786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lqmsr" event={"ID":"5231a25a-8bda-4f72-8a81-e5a49cdc31eb","Type":"ContainerStarted","Data":"a4c2b9640381d9fa568d89513a5f9297f28795b992c4bdd4796e60290e5a3366"} Mar 20 17:30:39 crc kubenswrapper[4795]: I0320 17:30:39.165841 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" event={"ID":"88832f68-9f72-4321-8d3f-bb3e23465fdb","Type":"ContainerStarted","Data":"f1b1c3d62274195c3190df7984c1b8d6d7bed0164a55be030f7fa0f8dae143ff"} Mar 20 17:30:43 crc kubenswrapper[4795]: I0320 17:30:43.469974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" 
event={"ID":"7df834a3-0298-4cc9-8b4e-49ce3f51183e","Type":"ContainerStarted","Data":"5f77e9227173777a2cb8dcfa07abdecd019fd9d7c0e54e1e1c537afb1f54b789"} Mar 20 17:30:43 crc kubenswrapper[4795]: I0320 17:30:43.501835 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" podStartSLOduration=4.124181095 podStartE2EDuration="6.501813893s" podCreationTimestamp="2026-03-20 17:30:37 +0000 UTC" firstStartedPulling="2026-03-20 17:30:38.145805883 +0000 UTC m=+781.603837424" lastFinishedPulling="2026-03-20 17:30:40.523438661 +0000 UTC m=+783.981470222" observedRunningTime="2026-03-20 17:30:43.487887213 +0000 UTC m=+786.945918754" watchObservedRunningTime="2026-03-20 17:30:43.501813893 +0000 UTC m=+786.959845434" Mar 20 17:30:44 crc kubenswrapper[4795]: I0320 17:30:44.479907 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lqmsr" event={"ID":"5231a25a-8bda-4f72-8a81-e5a49cdc31eb","Type":"ContainerStarted","Data":"4e8a02576ca9d01c5c5222d196331e0bb56a92554c3c73fdb749b55304af5769"} Mar 20 17:30:44 crc kubenswrapper[4795]: I0320 17:30:44.482200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" event={"ID":"88832f68-9f72-4321-8d3f-bb3e23465fdb","Type":"ContainerStarted","Data":"a3cbde9ffa21c0111c085397b230b40fd95ebae1d34a07cab60bd797651e9143"} Mar 20 17:30:44 crc kubenswrapper[4795]: I0320 17:30:44.482737 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:44 crc kubenswrapper[4795]: I0320 17:30:44.527071 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-lqmsr" podStartSLOduration=2.279664895 podStartE2EDuration="7.527054003s" podCreationTimestamp="2026-03-20 17:30:37 +0000 UTC" firstStartedPulling="2026-03-20 17:30:38.398355354 +0000 UTC 
m=+781.856386895" lastFinishedPulling="2026-03-20 17:30:43.645744442 +0000 UTC m=+787.103776003" observedRunningTime="2026-03-20 17:30:44.50007845 +0000 UTC m=+787.958110051" watchObservedRunningTime="2026-03-20 17:30:44.527054003 +0000 UTC m=+787.985085544" Mar 20 17:30:44 crc kubenswrapper[4795]: I0320 17:30:44.529165 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" podStartSLOduration=2.216598082 podStartE2EDuration="7.529159319s" podCreationTimestamp="2026-03-20 17:30:37 +0000 UTC" firstStartedPulling="2026-03-20 17:30:38.400605925 +0000 UTC m=+781.858637466" lastFinishedPulling="2026-03-20 17:30:43.713167162 +0000 UTC m=+787.171198703" observedRunningTime="2026-03-20 17:30:44.525763712 +0000 UTC m=+787.983795273" watchObservedRunningTime="2026-03-20 17:30:44.529159319 +0000 UTC m=+787.987190860" Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.734380 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-krk7q"] Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.736483 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-controller" containerID="cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.736954 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="northd" containerID="cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.737001 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" 
podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-node" containerID="cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.737087 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-acl-logging" containerID="cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.736974 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.737173 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="sbdb" containerID="cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.737238 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="nbdb" containerID="cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.830280 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" 
containerID="cri-o://c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" gracePeriod=30 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.115829 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/3.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.121652 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovn-acl-logging/0.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.122750 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovn-controller/0.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.123494 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.203869 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-srjsg"] Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204275 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204312 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204339 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204357 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc 
kubenswrapper[4795]: E0320 17:30:48.204380 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204399 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204429 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-node" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204448 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-node" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204477 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204495 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204524 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="sbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204545 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="sbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204576 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="northd" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204593 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="northd" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 
17:30:48.204622 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-acl-logging" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204640 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-acl-logging" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204664 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kubecfg-setup" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204680 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kubecfg-setup" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204743 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204761 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204783 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="nbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204800 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="nbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204827 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204846 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205068 
4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="sbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205102 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205125 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="northd" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205150 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205172 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205197 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-node" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205215 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="nbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205237 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-acl-logging" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205252 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205266 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc 
kubenswrapper[4795]: I0320 17:30:48.205283 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.205476 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205492 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205660 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.209794 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261747 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: 
\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261903 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vrl5\" (UniqueName: \"kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261957 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261980 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262019 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262091 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262113 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262133 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 
17:30:48.262194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262110 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262422 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262432 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262466 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262103 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). 
InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262147 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262164 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log" (OuterVolumeSpecName: "node-log") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262369 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262421 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket" (OuterVolumeSpecName: "log-socket") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash" (OuterVolumeSpecName: "host-slash") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262441 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262504 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262935 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.263101 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.271527 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.271796 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5" (OuterVolumeSpecName: "kube-api-access-4vrl5") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "kube-api-access-4vrl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.277031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363198 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-systemd-units\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-etc-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-log-socket\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-bin\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363514 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-node-log\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363608 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-systemd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-var-lib-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-script-lib\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363796 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-slash\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363830 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-netns\") pod \"ovnkube-node-srjsg\" (UID: 
\"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-ovn\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxpdl\" (UniqueName: \"kubernetes.io/projected/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-kube-api-access-zxpdl\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363967 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-env-overrides\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovn-node-metrics-cert\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364173 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364220 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-netd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364256 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-kubelet\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-config\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364618 4795 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364644 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364728 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364751 4795 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364771 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vrl5\" (UniqueName: \"kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364790 4795 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364809 4795 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364825 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364843 4795 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364859 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364877 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364895 4795 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364914 4795 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364931 4795 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.365035 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.365068 4795 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.365125 4795 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.365144 4795 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.365166 4795 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.465977 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-netd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466101 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-kubelet\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-config\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466178 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466223 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-systemd-units\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-etc-openvswitch\") pod 
\"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466323 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-netd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-log-socket\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-systemd-units\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466397 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-bin\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-node-log\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" 
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-systemd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-var-lib-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-script-lib\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-slash\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466609 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-netns\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466637 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-ovn\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxpdl\" (UniqueName: \"kubernetes.io/projected/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-kube-api-access-zxpdl\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466796 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-env-overrides\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466929 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-var-lib-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466963 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-ovn\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-etc-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-kubelet\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467113 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-log-socket\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-bin\") pod 
\"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467198 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-node-log\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467240 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-systemd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467287 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-slash\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-netns\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467477 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-config\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.468041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-env-overrides\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.468153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-script-lib\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466866 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.468744 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovn-node-metrics-cert\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.472485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovn-node-metrics-cert\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.487399 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxpdl\" (UniqueName: \"kubernetes.io/projected/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-kube-api-access-zxpdl\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.539405 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.543211 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/2.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.544071 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/1.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.544145 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8c31a7c-6ccb-43e0-9c95-33b85204cc39" containerID="199d60669fc8f63b3b210d2fc85e721bcf838edabcdff0694939a52f882125e7" exitCode=2 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.544233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" 
event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerDied","Data":"199d60669fc8f63b3b210d2fc85e721bcf838edabcdff0694939a52f882125e7"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.544282 4795 scope.go:117] "RemoveContainer" containerID="c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.544999 4795 scope.go:117] "RemoveContainer" containerID="199d60669fc8f63b3b210d2fc85e721bcf838edabcdff0694939a52f882125e7" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.545321 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xxwb6_openshift-multus(c8c31a7c-6ccb-43e0-9c95-33b85204cc39)\"" pod="openshift-multus/multus-xxwb6" podUID="c8c31a7c-6ccb-43e0-9c95-33b85204cc39" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.549530 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/3.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.555136 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovn-acl-logging/0.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.556418 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovn-controller/0.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557768 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557813 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557829 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557842 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557856 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557869 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557889 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" exitCode=143 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557904 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" exitCode=143 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 
17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557951 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558021 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558079 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558095 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558107 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558119 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558130 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558142 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558153 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558164 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558175 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558187 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558205 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558221 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558235 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558245 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558255 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558266 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558276 4795 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558286 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558296 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558306 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558316 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558345 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558357 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 
17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558368 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558379 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558390 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558400 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558411 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558422 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558432 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558442 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} Mar 20 
17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"1a7cad6fc70f9635016cf59ae47845a4cfbc41683f6ddf222d2b7bd36fabfbfb"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558471 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558482 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558493 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558505 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558515 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558527 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558537 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558549 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558559 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558569 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} Mar 20 17:30:48 crc kubenswrapper[4795]: W0320 17:30:48.569951 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b3d8c82_869d_4762_8d8b_56d9d2d2c9e5.slice/crio-7e9ffc5e4fc4ded2c601af31f1f9dd7ad033f54b7515e4d6fe36abdcdc466365 WatchSource:0}: Error finding container 7e9ffc5e4fc4ded2c601af31f1f9dd7ad033f54b7515e4d6fe36abdcdc466365: Status 404 returned error can't find the container with id 7e9ffc5e4fc4ded2c601af31f1f9dd7ad033f54b7515e4d6fe36abdcdc466365 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.593375 4795 scope.go:117] "RemoveContainer" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.630339 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-krk7q"] Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.632070 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-krk7q"] Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.638452 4795 scope.go:117] 
"RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.662813 4795 scope.go:117] "RemoveContainer" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.688629 4795 scope.go:117] "RemoveContainer" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.710961 4795 scope.go:117] "RemoveContainer" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.739521 4795 scope.go:117] "RemoveContainer" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.757484 4795 scope.go:117] "RemoveContainer" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.831826 4795 scope.go:117] "RemoveContainer" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.848485 4795 scope.go:117] "RemoveContainer" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.867142 4795 scope.go:117] "RemoveContainer" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.913507 4795 scope.go:117] "RemoveContainer" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.914045 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": container with ID starting with 
c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747 not found: ID does not exist" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914110 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} err="failed to get container status \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": rpc error: code = NotFound desc = could not find container \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": container with ID starting with c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914154 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.914549 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": container with ID starting with d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e not found: ID does not exist" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914576 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} err="failed to get container status \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": rpc error: code = NotFound desc = could not find container \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": container with ID starting with d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e not found: ID does not 
exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914590 4795 scope.go:117] "RemoveContainer" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.914857 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": container with ID starting with 2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859 not found: ID does not exist" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914878 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} err="failed to get container status \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": rpc error: code = NotFound desc = could not find container \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": container with ID starting with 2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914891 4795 scope.go:117] "RemoveContainer" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.915085 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": container with ID starting with bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d not found: ID does not exist" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915146 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} err="failed to get container status \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": rpc error: code = NotFound desc = could not find container \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": container with ID starting with bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915162 4795 scope.go:117] "RemoveContainer" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.915411 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": container with ID starting with ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6 not found: ID does not exist" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915427 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} err="failed to get container status \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": rpc error: code = NotFound desc = could not find container \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": container with ID starting with ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915440 4795 scope.go:117] "RemoveContainer" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.915644 4795 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": container with ID starting with 9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc not found: ID does not exist" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915662 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} err="failed to get container status \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": rpc error: code = NotFound desc = could not find container \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": container with ID starting with 9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915677 4795 scope.go:117] "RemoveContainer" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.915876 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": container with ID starting with 6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c not found: ID does not exist" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915905 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} err="failed to get container status \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": rpc error: code = NotFound desc = could 
not find container \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": container with ID starting with 6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915919 4795 scope.go:117] "RemoveContainer" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.916105 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": container with ID starting with f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc not found: ID does not exist" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916132 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} err="failed to get container status \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": rpc error: code = NotFound desc = could not find container \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": container with ID starting with f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916147 4795 scope.go:117] "RemoveContainer" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.916460 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": container with ID starting with c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892 not found: 
ID does not exist" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916527 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} err="failed to get container status \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": rpc error: code = NotFound desc = could not find container \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": container with ID starting with c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916569 4795 scope.go:117] "RemoveContainer" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.916901 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": container with ID starting with a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169 not found: ID does not exist" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916929 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} err="failed to get container status \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": rpc error: code = NotFound desc = could not find container \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": container with ID starting with a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916943 4795 
scope.go:117] "RemoveContainer" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917155 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} err="failed to get container status \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": rpc error: code = NotFound desc = could not find container \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": container with ID starting with c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917176 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917415 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} err="failed to get container status \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": rpc error: code = NotFound desc = could not find container \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": container with ID starting with d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917442 4795 scope.go:117] "RemoveContainer" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917738 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} err="failed to get container status \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": rpc 
error: code = NotFound desc = could not find container \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": container with ID starting with 2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917814 4795 scope.go:117] "RemoveContainer" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918114 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} err="failed to get container status \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": rpc error: code = NotFound desc = could not find container \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": container with ID starting with bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918149 4795 scope.go:117] "RemoveContainer" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918433 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} err="failed to get container status \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": rpc error: code = NotFound desc = could not find container \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": container with ID starting with ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918459 4795 scope.go:117] "RemoveContainer" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" Mar 20 17:30:48 crc 
kubenswrapper[4795]: I0320 17:30:48.918637 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} err="failed to get container status \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": rpc error: code = NotFound desc = could not find container \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": container with ID starting with 9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918660 4795 scope.go:117] "RemoveContainer" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918904 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} err="failed to get container status \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": rpc error: code = NotFound desc = could not find container \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": container with ID starting with 6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918934 4795 scope.go:117] "RemoveContainer" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919138 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} err="failed to get container status \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": rpc error: code = NotFound desc = could not find container \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": container 
with ID starting with f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919161 4795 scope.go:117] "RemoveContainer" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919330 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} err="failed to get container status \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": rpc error: code = NotFound desc = could not find container \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": container with ID starting with c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919352 4795 scope.go:117] "RemoveContainer" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919508 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} err="failed to get container status \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": rpc error: code = NotFound desc = could not find container \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": container with ID starting with a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919527 4795 scope.go:117] "RemoveContainer" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919707 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} err="failed to get container status \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": rpc error: code = NotFound desc = could not find container \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": container with ID starting with c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919724 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919891 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} err="failed to get container status \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": rpc error: code = NotFound desc = could not find container \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": container with ID starting with d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919910 4795 scope.go:117] "RemoveContainer" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920068 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} err="failed to get container status \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": rpc error: code = NotFound desc = could not find container \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": container with ID starting with 2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859 not found: ID does not 
exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920101 4795 scope.go:117] "RemoveContainer" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920252 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} err="failed to get container status \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": rpc error: code = NotFound desc = could not find container \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": container with ID starting with bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920270 4795 scope.go:117] "RemoveContainer" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920444 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} err="failed to get container status \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": rpc error: code = NotFound desc = could not find container \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": container with ID starting with ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920460 4795 scope.go:117] "RemoveContainer" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920621 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} err="failed to get container status 
\"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": rpc error: code = NotFound desc = could not find container \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": container with ID starting with 9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920638 4795 scope.go:117] "RemoveContainer" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920808 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} err="failed to get container status \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": rpc error: code = NotFound desc = could not find container \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": container with ID starting with 6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920827 4795 scope.go:117] "RemoveContainer" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921007 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} err="failed to get container status \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": rpc error: code = NotFound desc = could not find container \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": container with ID starting with f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921030 4795 scope.go:117] "RemoveContainer" 
containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921216 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} err="failed to get container status \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": rpc error: code = NotFound desc = could not find container \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": container with ID starting with c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921233 4795 scope.go:117] "RemoveContainer" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921405 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} err="failed to get container status \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": rpc error: code = NotFound desc = could not find container \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": container with ID starting with a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921430 4795 scope.go:117] "RemoveContainer" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921586 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} err="failed to get container status \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": rpc error: code = NotFound desc = could 
not find container \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": container with ID starting with c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921604 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921786 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} err="failed to get container status \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": rpc error: code = NotFound desc = could not find container \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": container with ID starting with d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921800 4795 scope.go:117] "RemoveContainer" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921978 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} err="failed to get container status \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": rpc error: code = NotFound desc = could not find container \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": container with ID starting with 2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921994 4795 scope.go:117] "RemoveContainer" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 
17:30:48.922145 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} err="failed to get container status \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": rpc error: code = NotFound desc = could not find container \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": container with ID starting with bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922158 4795 scope.go:117] "RemoveContainer" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922349 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} err="failed to get container status \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": rpc error: code = NotFound desc = could not find container \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": container with ID starting with ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922362 4795 scope.go:117] "RemoveContainer" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922526 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} err="failed to get container status \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": rpc error: code = NotFound desc = could not find container \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": container with ID starting with 
9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922548 4795 scope.go:117] "RemoveContainer" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922730 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} err="failed to get container status \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": rpc error: code = NotFound desc = could not find container \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": container with ID starting with 6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922747 4795 scope.go:117] "RemoveContainer" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.923018 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} err="failed to get container status \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": rpc error: code = NotFound desc = could not find container \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": container with ID starting with f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.923042 4795 scope.go:117] "RemoveContainer" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.923431 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} err="failed to get container status \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": rpc error: code = NotFound desc = could not find container \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": container with ID starting with c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.923461 4795 scope.go:117] "RemoveContainer" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.923813 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} err="failed to get container status \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": rpc error: code = NotFound desc = could not find container \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": container with ID starting with a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169 not found: ID does not exist" Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.265272 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" path="/var/lib/kubelet/pods/520bb74b-cfa2-4f21-b561-989b0a3d6adc/volumes" Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.288784 4795 scope.go:117] "RemoveContainer" containerID="e208a8a62ce5332bce059cfe9498a63b10989e2ede473bf8237789de0f3da7f0" Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.566844 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/2.log" Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.568830 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5" containerID="005e00657a18f9011e09c890eb2968bb56738f35496a3e1f6ab829d77d35eee1" exitCode=0 Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.568899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerDied","Data":"005e00657a18f9011e09c890eb2968bb56738f35496a3e1f6ab829d77d35eee1"} Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.568948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"7e9ffc5e4fc4ded2c601af31f1f9dd7ad033f54b7515e4d6fe36abdcdc466365"} Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"e59bf0716bd255cd688ae57e9f097ab38b7b6f13866739bd382abf1678e530ee"} Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"37577f9e1c030e30cba8d0dd22808abddd25b232d48bd61cb38ce696ab3a22f2"} Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579888 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"1ce5bcd753e77cb30874142addbe046b6b8282c35867894de9e349ff63feac8f"} Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579900 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" 
event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"d4c6d1d1d49f3494c84d195c8c4d887bc335bb03491acece41d188edc99c984c"} Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"35eaf00d26807dcd790c0828962a838138c457f4e57cfc0916c67c9e0c56b252"} Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"c3b0a5cd96aa4486c545a0503e67b092787fa04696de6cb65fc17979ce116958"} Mar 20 17:30:52 crc kubenswrapper[4795]: I0320 17:30:52.967488 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:53 crc kubenswrapper[4795]: I0320 17:30:53.605339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"5c3504372687ee1d8851e55ffda227591f5e52a7066c585c151dae88a51adaef"} Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.623231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"1063cc41fdd40a7916837151f74b6dfe5ea45cbc82aea21b74b7497907b819fb"} Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.623899 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.623916 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 
17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.623929 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.657011 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.657775 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.676361 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" podStartSLOduration=7.676336363 podStartE2EDuration="7.676336363s" podCreationTimestamp="2026-03-20 17:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:30:55.670912282 +0000 UTC m=+799.128943833" watchObservedRunningTime="2026-03-20 17:30:55.676336363 +0000 UTC m=+799.134367944" Mar 20 17:31:00 crc kubenswrapper[4795]: I0320 17:31:00.252505 4795 scope.go:117] "RemoveContainer" containerID="199d60669fc8f63b3b210d2fc85e721bcf838edabcdff0694939a52f882125e7" Mar 20 17:31:00 crc kubenswrapper[4795]: I0320 17:31:00.732821 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/2.log" Mar 20 17:31:00 crc kubenswrapper[4795]: I0320 17:31:00.733327 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerStarted","Data":"940c532728c5764b45b49b713e9e2b429773e60bb316ae666156a65d826c3a77"} Mar 20 17:31:11 crc kubenswrapper[4795]: I0320 17:31:11.300445 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:31:11 crc kubenswrapper[4795]: I0320 17:31:11.300902 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:31:18 crc kubenswrapper[4795]: I0320 17:31:18.576993 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:31:27 crc kubenswrapper[4795]: I0320 17:31:27.764273 4795 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.442337 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"] Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.444770 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.448678 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.459713 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"] Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.589585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rstv\" (UniqueName: \"kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.589763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.589833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: 
I0320 17:31:40.691830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.691976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rstv\" (UniqueName: \"kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.692050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.693129 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.695355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.712560 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rstv\" (UniqueName: \"kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.770093 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 17:31:41.023543 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"] Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 17:31:41.300639 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 17:31:41.301093 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 
17:31:41.878866 4795 generic.go:334] "Generic (PLEG): container finished" podID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerID="85d39cd3b7573adaa8cd33998666d4c7e79dd991bc6ef15c2cea7285efac8969" exitCode=0 Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 17:31:41.878973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" event={"ID":"cf16b9b7-bdbf-48db-a358-3c32c93b3d43","Type":"ContainerDied","Data":"85d39cd3b7573adaa8cd33998666d4c7e79dd991bc6ef15c2cea7285efac8969"} Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 17:31:41.880114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" event={"ID":"cf16b9b7-bdbf-48db-a358-3c32c93b3d43","Type":"ContainerStarted","Data":"98d214b944aebd78a2b668652156b847f1c5ec8fc4b459e0c772615f90095c54"} Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.754551 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.758462 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.766894 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.928524 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.928791 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtnk4\" (UniqueName: \"kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.928847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.030371 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.030455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.030480 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtnk4\" (UniqueName: \"kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.030984 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.031086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.069095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtnk4\" (UniqueName: \"kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.094680 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.308865 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:31:43 crc kubenswrapper[4795]: W0320 17:31:43.322142 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod596dd5ef_f287_4f26_9618_c7763a911124.slice/crio-6ecffaee968618ba46837c7d9197230f37328559d4478e137afa21294923f290 WatchSource:0}: Error finding container 6ecffaee968618ba46837c7d9197230f37328559d4478e137afa21294923f290: Status 404 returned error can't find the container with id 6ecffaee968618ba46837c7d9197230f37328559d4478e137afa21294923f290 Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.890663 4795 generic.go:334] "Generic (PLEG): container finished" podID="596dd5ef-f287-4f26-9618-c7763a911124" containerID="b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d" exitCode=0 Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.890749 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerDied","Data":"b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d"} Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.890778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerStarted","Data":"6ecffaee968618ba46837c7d9197230f37328559d4478e137afa21294923f290"} Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.893533 4795 generic.go:334] "Generic (PLEG): container finished" podID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerID="dd3c89461c58ed60219166c8784c9924960bbd8b2c8a5bee74fe1482bf5922a7" exitCode=0 Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.893563 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" event={"ID":"cf16b9b7-bdbf-48db-a358-3c32c93b3d43","Type":"ContainerDied","Data":"dd3c89461c58ed60219166c8784c9924960bbd8b2c8a5bee74fe1482bf5922a7"} Mar 20 17:31:44 crc kubenswrapper[4795]: I0320 17:31:44.905278 4795 generic.go:334] "Generic (PLEG): container finished" podID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerID="863e88b27c645f7b01ba4a1b4c66d16ca94747da1b2a0288c2d28921ca533697" exitCode=0 Mar 20 17:31:44 crc kubenswrapper[4795]: I0320 17:31:44.905484 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" event={"ID":"cf16b9b7-bdbf-48db-a358-3c32c93b3d43","Type":"ContainerDied","Data":"863e88b27c645f7b01ba4a1b4c66d16ca94747da1b2a0288c2d28921ca533697"} Mar 20 17:31:44 crc kubenswrapper[4795]: I0320 17:31:44.909325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerStarted","Data":"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5"} Mar 20 17:31:45 crc kubenswrapper[4795]: I0320 17:31:45.920197 4795 generic.go:334] "Generic (PLEG): container finished" podID="596dd5ef-f287-4f26-9618-c7763a911124" containerID="fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5" exitCode=0 Mar 20 17:31:45 crc kubenswrapper[4795]: I0320 17:31:45.920581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerDied","Data":"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5"} Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.307997 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.378013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rstv\" (UniqueName: \"kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv\") pod \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.378353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util\") pod \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.378417 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle\") pod \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.379471 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle" (OuterVolumeSpecName: "bundle") pod "cf16b9b7-bdbf-48db-a358-3c32c93b3d43" (UID: "cf16b9b7-bdbf-48db-a358-3c32c93b3d43"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.381303 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.387345 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv" (OuterVolumeSpecName: "kube-api-access-7rstv") pod "cf16b9b7-bdbf-48db-a358-3c32c93b3d43" (UID: "cf16b9b7-bdbf-48db-a358-3c32c93b3d43"). InnerVolumeSpecName "kube-api-access-7rstv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.484966 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rstv\" (UniqueName: \"kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv\") on node \"crc\" DevicePath \"\"" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.755832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util" (OuterVolumeSpecName: "util") pod "cf16b9b7-bdbf-48db-a358-3c32c93b3d43" (UID: "cf16b9b7-bdbf-48db-a358-3c32c93b3d43"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.788722 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util\") on node \"crc\" DevicePath \"\"" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.928342 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerStarted","Data":"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42"} Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.930953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" event={"ID":"cf16b9b7-bdbf-48db-a358-3c32c93b3d43","Type":"ContainerDied","Data":"98d214b944aebd78a2b668652156b847f1c5ec8fc4b459e0c772615f90095c54"} Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.930971 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d214b944aebd78a2b668652156b847f1c5ec8fc4b459e0c772615f90095c54" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.931010 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.968966 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ccgqp" podStartSLOduration=2.422073167 podStartE2EDuration="4.968947286s" podCreationTimestamp="2026-03-20 17:31:42 +0000 UTC" firstStartedPulling="2026-03-20 17:31:43.892478242 +0000 UTC m=+847.350509783" lastFinishedPulling="2026-03-20 17:31:46.439352331 +0000 UTC m=+849.897383902" observedRunningTime="2026-03-20 17:31:46.96499087 +0000 UTC m=+850.423022411" watchObservedRunningTime="2026-03-20 17:31:46.968947286 +0000 UTC m=+850.426978837" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.067308 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dlcps"] Mar 20 17:31:51 crc kubenswrapper[4795]: E0320 17:31:51.068094 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="extract" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.068110 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="extract" Mar 20 17:31:51 crc kubenswrapper[4795]: E0320 17:31:51.068122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="util" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.068129 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="util" Mar 20 17:31:51 crc kubenswrapper[4795]: E0320 17:31:51.068142 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="pull" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.068149 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="pull" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.068285 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="extract" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.068708 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.072127 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.075857 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.076148 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jpd2k" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.077553 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dlcps"] Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.250764 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqsh\" (UniqueName: \"kubernetes.io/projected/efca4120-31ef-4c52-a6da-59b33144a979-kube-api-access-mjqsh\") pod \"nmstate-operator-796d4cfff4-dlcps\" (UID: \"efca4120-31ef-4c52-a6da-59b33144a979\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.352678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqsh\" (UniqueName: \"kubernetes.io/projected/efca4120-31ef-4c52-a6da-59b33144a979-kube-api-access-mjqsh\") pod \"nmstate-operator-796d4cfff4-dlcps\" (UID: \"efca4120-31ef-4c52-a6da-59b33144a979\") " 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.381312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqsh\" (UniqueName: \"kubernetes.io/projected/efca4120-31ef-4c52-a6da-59b33144a979-kube-api-access-mjqsh\") pod \"nmstate-operator-796d4cfff4-dlcps\" (UID: \"efca4120-31ef-4c52-a6da-59b33144a979\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.423899 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.646305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dlcps"] Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.964697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" event={"ID":"efca4120-31ef-4c52-a6da-59b33144a979","Type":"ContainerStarted","Data":"6c4df5b572666df4d95c2f029bddd98f0cc74c84c6548cc3e6e40e6f5945b36b"} Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.095151 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.095418 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.742205 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.743823 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.766181 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.884670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.884739 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5sr\" (UniqueName: \"kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.884967 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.986596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.986705 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.986728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5sr\" (UniqueName: \"kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.987168 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.987358 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:54 crc kubenswrapper[4795]: I0320 17:31:54.016795 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5sr\" (UniqueName: \"kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:54 crc kubenswrapper[4795]: I0320 17:31:54.076285 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:54 crc kubenswrapper[4795]: I0320 17:31:54.145412 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ccgqp" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="registry-server" probeResult="failure" output=< Mar 20 17:31:54 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:31:54 crc kubenswrapper[4795]: > Mar 20 17:31:54 crc kubenswrapper[4795]: I0320 17:31:54.542552 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:31:55 crc kubenswrapper[4795]: W0320 17:31:55.330065 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6158d7fa_4982_4817_9c29_b1f3c3fd70d7.slice/crio-306762ba2f3a916ee3f297df221a3a99b8f647575a6fbd57139b3caf80350c2e WatchSource:0}: Error finding container 306762ba2f3a916ee3f297df221a3a99b8f647575a6fbd57139b3caf80350c2e: Status 404 returned error can't find the container with id 306762ba2f3a916ee3f297df221a3a99b8f647575a6fbd57139b3caf80350c2e Mar 20 17:31:55 crc kubenswrapper[4795]: I0320 17:31:55.992290 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerStarted","Data":"306762ba2f3a916ee3f297df221a3a99b8f647575a6fbd57139b3caf80350c2e"} Mar 20 17:31:57 crc kubenswrapper[4795]: I0320 17:31:57.003197 4795 generic.go:334] "Generic (PLEG): container finished" podID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerID="341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4" exitCode=0 Mar 20 17:31:57 crc kubenswrapper[4795]: I0320 17:31:57.003286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" 
event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerDied","Data":"341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4"} Mar 20 17:31:58 crc kubenswrapper[4795]: I0320 17:31:58.012856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" event={"ID":"efca4120-31ef-4c52-a6da-59b33144a979","Type":"ContainerStarted","Data":"e451b6c96025b3dc22e7abc27afcac5348c1901cd3239785abeeac594944b56a"} Mar 20 17:31:58 crc kubenswrapper[4795]: I0320 17:31:58.043515 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" podStartSLOduration=1.477015384 podStartE2EDuration="7.043494583s" podCreationTimestamp="2026-03-20 17:31:51 +0000 UTC" firstStartedPulling="2026-03-20 17:31:51.660648104 +0000 UTC m=+855.118679645" lastFinishedPulling="2026-03-20 17:31:57.227127293 +0000 UTC m=+860.685158844" observedRunningTime="2026-03-20 17:31:58.033027801 +0000 UTC m=+861.491059372" watchObservedRunningTime="2026-03-20 17:31:58.043494583 +0000 UTC m=+861.501526124" Mar 20 17:31:59 crc kubenswrapper[4795]: I0320 17:31:59.023092 4795 generic.go:334] "Generic (PLEG): container finished" podID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerID="1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386" exitCode=0 Mar 20 17:31:59 crc kubenswrapper[4795]: I0320 17:31:59.023205 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerDied","Data":"1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386"} Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.035426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" 
event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerStarted","Data":"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5"} Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.059299 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4gbg2" podStartSLOduration=4.870434044 podStartE2EDuration="7.059272179s" podCreationTimestamp="2026-03-20 17:31:53 +0000 UTC" firstStartedPulling="2026-03-20 17:31:57.218384726 +0000 UTC m=+860.676416287" lastFinishedPulling="2026-03-20 17:31:59.407222851 +0000 UTC m=+862.865254422" observedRunningTime="2026-03-20 17:32:00.057661338 +0000 UTC m=+863.515692899" watchObservedRunningTime="2026-03-20 17:32:00.059272179 +0000 UTC m=+863.517303750" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.146911 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567132-b9gh7"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.148033 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.151368 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.153103 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.153441 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.156884 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-b9gh7"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.273276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcvpj\" (UniqueName: \"kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj\") pod \"auto-csr-approver-29567132-b9gh7\" (UID: \"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd\") " pod="openshift-infra/auto-csr-approver-29567132-b9gh7" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.374292 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcvpj\" (UniqueName: \"kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj\") pod \"auto-csr-approver-29567132-b9gh7\" (UID: \"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd\") " pod="openshift-infra/auto-csr-approver-29567132-b9gh7" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.395427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcvpj\" (UniqueName: \"kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj\") pod \"auto-csr-approver-29567132-b9gh7\" (UID: \"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd\") " 
pod="openshift-infra/auto-csr-approver-29567132-b9gh7" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.471913 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.644471 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.645573 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.649846 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bftb5" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.664065 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.667723 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.668362 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.669932 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.674030 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bsp49"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.674959 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.685969 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqnfh\" (UniqueName: \"kubernetes.io/projected/f50011ef-d180-4d84-ba10-a2da522a579d-kube-api-access-pqnfh\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9fj\" (UniqueName: \"kubernetes.io/projected/e070281f-65f5-4c6d-b012-06c027393646-kube-api-access-ch9fj\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whpbn\" (UniqueName: \"kubernetes.io/projected/65c42497-77ba-49bc-a292-5003a353fde6-kube-api-access-whpbn\") pod \"nmstate-metrics-9b8c8685d-xjj2s\" (UID: \"65c42497-77ba-49bc-a292-5003a353fde6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f50011ef-d180-4d84-ba10-a2da522a579d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:00 crc kubenswrapper[4795]: 
I0320 17:32:00.687365 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-ovs-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687384 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-nmstate-lock\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-dbus-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.729811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-b9gh7"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.780672 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.781310 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.783365 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.783541 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.783665 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-79hq8" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788346 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f50011ef-d180-4d84-ba10-a2da522a579d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-ovs-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-nmstate-lock\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-dbus-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788480 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqnfh\" (UniqueName: \"kubernetes.io/projected/f50011ef-d180-4d84-ba10-a2da522a579d-kube-api-access-pqnfh\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788501 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9fj\" (UniqueName: \"kubernetes.io/projected/e070281f-65f5-4c6d-b012-06c027393646-kube-api-access-ch9fj\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788528 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whpbn\" (UniqueName: \"kubernetes.io/projected/65c42497-77ba-49bc-a292-5003a353fde6-kube-api-access-whpbn\") pod \"nmstate-metrics-9b8c8685d-xjj2s\" (UID: \"65c42497-77ba-49bc-a292-5003a353fde6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-nmstate-lock\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.789047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-dbus-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.789390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-ovs-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.790804 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.796997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f50011ef-d180-4d84-ba10-a2da522a579d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.816925 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9fj\" (UniqueName: \"kubernetes.io/projected/e070281f-65f5-4c6d-b012-06c027393646-kube-api-access-ch9fj\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.823572 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqnfh\" (UniqueName: \"kubernetes.io/projected/f50011ef-d180-4d84-ba10-a2da522a579d-kube-api-access-pqnfh\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 
17:32:00.823675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whpbn\" (UniqueName: \"kubernetes.io/projected/65c42497-77ba-49bc-a292-5003a353fde6-kube-api-access-whpbn\") pod \"nmstate-metrics-9b8c8685d-xjj2s\" (UID: \"65c42497-77ba-49bc-a292-5003a353fde6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.889925 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69nj4\" (UniqueName: \"kubernetes.io/projected/d34761db-41bf-4e5f-bdca-8c25e281c924-kube-api-access-69nj4\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.890344 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d34761db-41bf-4e5f-bdca-8c25e281c924-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.890399 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d34761db-41bf-4e5f-bdca-8c25e281c924-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.970893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c5444f47d-jcv9w"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.971521 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.981364 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5444f47d-jcv9w"] Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996319 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996754 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d34761db-41bf-4e5f-bdca-8c25e281c924-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-console-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-oauth-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69nj4\" (UniqueName: \"kubernetes.io/projected/d34761db-41bf-4e5f-bdca-8c25e281c924-kube-api-access-69nj4\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: 
\"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996933 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptnjm\" (UniqueName: \"kubernetes.io/projected/b27883d5-9c21-4869-b626-3fe39f007913-kube-api-access-ptnjm\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996990 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-service-ca\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.997010 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-oauth-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.997043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d34761db-41bf-4e5f-bdca-8c25e281c924-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.997070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-trusted-ca-bundle\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.997651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d34761db-41bf-4e5f-bdca-8c25e281c924-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.005749 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d34761db-41bf-4e5f-bdca-8c25e281c924-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.013191 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.020941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69nj4\" (UniqueName: \"kubernetes.io/projected/d34761db-41bf-4e5f-bdca-8c25e281c924-kube-api-access-69nj4\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.042031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" event={"ID":"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd","Type":"ContainerStarted","Data":"8d336676c584568330dc6906a42fe736bdf03c4eab298e774dfad7ffa35d7551"} Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.042871 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:01 crc kubenswrapper[4795]: W0320 17:32:01.067217 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode070281f_65f5_4c6d_b012_06c027393646.slice/crio-dec3f3398a33fe5be4e6458f497ee4009f7aa21becff93bc1b7bbd2cfe2b13c6 WatchSource:0}: Error finding container dec3f3398a33fe5be4e6458f497ee4009f7aa21becff93bc1b7bbd2cfe2b13c6: Status 404 returned error can't find the container with id dec3f3398a33fe5be4e6458f497ee4009f7aa21becff93bc1b7bbd2cfe2b13c6 Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.095564 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-console-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097615 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-oauth-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptnjm\" (UniqueName: \"kubernetes.io/projected/b27883d5-9c21-4869-b626-3fe39f007913-kube-api-access-ptnjm\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097693 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-service-ca\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " 
pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097709 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-oauth-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097743 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-trusted-ca-bundle\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.098577 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-console-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.098586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-trusted-ca-bundle\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.098630 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-service-ca\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc 
kubenswrapper[4795]: I0320 17:32:01.099131 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-oauth-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.103098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-oauth-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.103921 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.116116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptnjm\" (UniqueName: \"kubernetes.io/projected/b27883d5-9c21-4869-b626-3fe39f007913-kube-api-access-ptnjm\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.231457 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"] Mar 20 17:32:01 crc kubenswrapper[4795]: W0320 17:32:01.241425 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65c42497_77ba_49bc_a292_5003a353fde6.slice/crio-996eb0156974055a7a10ec98e543cfa3f58538e492e34858aebe04a81d8ab93b WatchSource:0}: Error finding container 996eb0156974055a7a10ec98e543cfa3f58538e492e34858aebe04a81d8ab93b: Status 404 returned error can't find the container with id 996eb0156974055a7a10ec98e543cfa3f58538e492e34858aebe04a81d8ab93b Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.263700 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"] Mar 20 17:32:01 crc kubenswrapper[4795]: W0320 17:32:01.281180 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50011ef_d180_4d84_ba10_a2da522a579d.slice/crio-f59db0b8777d8cdfc3864460fbcedf8f524e4fddadaa4889a1471603d5aa37c5 WatchSource:0}: Error finding container f59db0b8777d8cdfc3864460fbcedf8f524e4fddadaa4889a1471603d5aa37c5: Status 404 returned error can't find the container with id f59db0b8777d8cdfc3864460fbcedf8f524e4fddadaa4889a1471603d5aa37c5 Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.297156 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.306512 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"] Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.700932 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5444f47d-jcv9w"] Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.049927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bsp49" event={"ID":"e070281f-65f5-4c6d-b012-06c027393646","Type":"ContainerStarted","Data":"dec3f3398a33fe5be4e6458f497ee4009f7aa21becff93bc1b7bbd2cfe2b13c6"} Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.052978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" event={"ID":"d34761db-41bf-4e5f-bdca-8c25e281c924","Type":"ContainerStarted","Data":"acedf09b035200d8295f483704044692e759e05b7a842a68c484ac0d84c79f71"} Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.054728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" event={"ID":"65c42497-77ba-49bc-a292-5003a353fde6","Type":"ContainerStarted","Data":"996eb0156974055a7a10ec98e543cfa3f58538e492e34858aebe04a81d8ab93b"} Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.056717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5444f47d-jcv9w" event={"ID":"b27883d5-9c21-4869-b626-3fe39f007913","Type":"ContainerStarted","Data":"2717b812a369ff2db01ee54aae4f9812df13ad6781d579c7086699913a6bb582"} Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.056751 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5444f47d-jcv9w" 
event={"ID":"b27883d5-9c21-4869-b626-3fe39f007913","Type":"ContainerStarted","Data":"0e72f9902ef5f4ecebae0b68c201fabd21f682e9a69fe3297d68ace71703ce51"} Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.060510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" event={"ID":"f50011ef-d180-4d84-ba10-a2da522a579d","Type":"ContainerStarted","Data":"f59db0b8777d8cdfc3864460fbcedf8f524e4fddadaa4889a1471603d5aa37c5"} Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.091577 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c5444f47d-jcv9w" podStartSLOduration=2.091547919 podStartE2EDuration="2.091547919s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:32:02.083227655 +0000 UTC m=+865.541259226" watchObservedRunningTime="2026-03-20 17:32:02.091547919 +0000 UTC m=+865.549579490" Mar 20 17:32:03 crc kubenswrapper[4795]: I0320 17:32:03.072212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" event={"ID":"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd","Type":"ContainerStarted","Data":"616dc96d2d0585b233d6d56de6ca35d75cb094f9e09720808ac513c2c13b7e20"} Mar 20 17:32:03 crc kubenswrapper[4795]: I0320 17:32:03.096422 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" podStartSLOduration=1.238870298 podStartE2EDuration="3.096398445s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="2026-03-20 17:32:00.725915691 +0000 UTC m=+864.183947222" lastFinishedPulling="2026-03-20 17:32:02.583443788 +0000 UTC m=+866.041475369" observedRunningTime="2026-03-20 17:32:03.088900618 +0000 UTC m=+866.546932199" watchObservedRunningTime="2026-03-20 17:32:03.096398445 +0000 UTC 
m=+866.554430016" Mar 20 17:32:03 crc kubenswrapper[4795]: I0320 17:32:03.154388 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:32:03 crc kubenswrapper[4795]: I0320 17:32:03.198276 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:32:03 crc kubenswrapper[4795]: I0320 17:32:03.927089 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:32:04 crc kubenswrapper[4795]: I0320 17:32:04.077218 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:32:04 crc kubenswrapper[4795]: I0320 17:32:04.077276 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:32:04 crc kubenswrapper[4795]: I0320 17:32:04.079798 4795 generic.go:334] "Generic (PLEG): container finished" podID="9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" containerID="616dc96d2d0585b233d6d56de6ca35d75cb094f9e09720808ac513c2c13b7e20" exitCode=0 Mar 20 17:32:04 crc kubenswrapper[4795]: I0320 17:32:04.079832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" event={"ID":"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd","Type":"ContainerDied","Data":"616dc96d2d0585b233d6d56de6ca35d75cb094f9e09720808ac513c2c13b7e20"} Mar 20 17:32:04 crc kubenswrapper[4795]: I0320 17:32:04.130467 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.086602 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ccgqp" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="registry-server" 
containerID="cri-o://2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42" gracePeriod=2 Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.131641 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.581113 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.665770 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcvpj\" (UniqueName: \"kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj\") pod \"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd\" (UID: \"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd\") " Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.676225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj" (OuterVolumeSpecName: "kube-api-access-wcvpj") pod "9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" (UID: "9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd"). InnerVolumeSpecName "kube-api-access-wcvpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.727720 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.766900 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities\") pod \"596dd5ef-f287-4f26-9618-c7763a911124\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.767146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtnk4\" (UniqueName: \"kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4\") pod \"596dd5ef-f287-4f26-9618-c7763a911124\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.767260 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content\") pod \"596dd5ef-f287-4f26-9618-c7763a911124\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.767580 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcvpj\" (UniqueName: \"kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.768060 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities" (OuterVolumeSpecName: "utilities") pod "596dd5ef-f287-4f26-9618-c7763a911124" (UID: "596dd5ef-f287-4f26-9618-c7763a911124"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.770480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4" (OuterVolumeSpecName: "kube-api-access-rtnk4") pod "596dd5ef-f287-4f26-9618-c7763a911124" (UID: "596dd5ef-f287-4f26-9618-c7763a911124"). InnerVolumeSpecName "kube-api-access-rtnk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.869605 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.869642 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtnk4\" (UniqueName: \"kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.902073 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "596dd5ef-f287-4f26-9618-c7763a911124" (UID: "596dd5ef-f287-4f26-9618-c7763a911124"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.971022 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.093260 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" event={"ID":"65c42497-77ba-49bc-a292-5003a353fde6","Type":"ContainerStarted","Data":"ea6461fdf384c7fdde5d1bc30899be010563aaeb383147b68ad8ed612f14c3f1"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.095418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" event={"ID":"f50011ef-d180-4d84-ba10-a2da522a579d","Type":"ContainerStarted","Data":"d459d76826aed1acb3ecbfc9d943cb5cb14adf4e987d31265355842277ea08c1"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.095486 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.099792 4795 generic.go:334] "Generic (PLEG): container finished" podID="596dd5ef-f287-4f26-9618-c7763a911124" containerID="2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42" exitCode=0 Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.099820 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.099868 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerDied","Data":"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.099900 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerDied","Data":"6ecffaee968618ba46837c7d9197230f37328559d4478e137afa21294923f290"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.099920 4795 scope.go:117] "RemoveContainer" containerID="2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.101321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bsp49" event={"ID":"e070281f-65f5-4c6d-b012-06c027393646","Type":"ContainerStarted","Data":"a07bc285a6c4703cf239433a163e3c24d27e03d403d782478573d52aed067438"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.101828 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.108298 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" event={"ID":"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd","Type":"ContainerDied","Data":"8d336676c584568330dc6906a42fe736bdf03c4eab298e774dfad7ffa35d7551"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.108332 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d336676c584568330dc6906a42fe736bdf03c4eab298e774dfad7ffa35d7551" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.108387 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.112213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" event={"ID":"d34761db-41bf-4e5f-bdca-8c25e281c924","Type":"ContainerStarted","Data":"c3c19082ec0adf9bc8e7dde9f17563e5d6f70e498a384dd9ebe4a25e0784f8c9"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.121818 4795 scope.go:117] "RemoveContainer" containerID="fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.121372 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" podStartSLOduration=1.715865704 podStartE2EDuration="6.121350615s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="2026-03-20 17:32:01.282316575 +0000 UTC m=+864.740348116" lastFinishedPulling="2026-03-20 17:32:05.687801476 +0000 UTC m=+869.145833027" observedRunningTime="2026-03-20 17:32:06.117480723 +0000 UTC m=+869.575512274" watchObservedRunningTime="2026-03-20 17:32:06.121350615 +0000 UTC m=+869.579382156" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.157514 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-nhz8w"] Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.160009 4795 scope.go:117] "RemoveContainer" containerID="b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.171442 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-nhz8w"] Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.176508 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" podStartSLOduration=1.961529145 podStartE2EDuration="6.176493245s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="2026-03-20 17:32:01.323246933 +0000 UTC m=+864.781278464" lastFinishedPulling="2026-03-20 17:32:05.538210983 +0000 UTC m=+868.996242564" observedRunningTime="2026-03-20 17:32:06.167572991 +0000 UTC m=+869.625604542" watchObservedRunningTime="2026-03-20 17:32:06.176493245 +0000 UTC m=+869.634524786" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.202201 4795 scope.go:117] "RemoveContainer" containerID="2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42" Mar 20 17:32:06 crc kubenswrapper[4795]: E0320 17:32:06.202789 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42\": container with ID starting with 2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42 not found: ID does not exist" containerID="2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.202819 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42"} err="failed to get container status \"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42\": rpc error: code = NotFound desc = could not find container \"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42\": container with ID starting with 2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42 not found: ID does not exist" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.202839 4795 scope.go:117] "RemoveContainer" containerID="fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5" Mar 20 17:32:06 crc kubenswrapper[4795]: E0320 17:32:06.203635 4795 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5\": container with ID starting with fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5 not found: ID does not exist" containerID="fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.203666 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5"} err="failed to get container status \"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5\": rpc error: code = NotFound desc = could not find container \"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5\": container with ID starting with fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5 not found: ID does not exist" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.203710 4795 scope.go:117] "RemoveContainer" containerID="b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.208288 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bsp49" podStartSLOduration=1.583703541 podStartE2EDuration="6.208265651s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="2026-03-20 17:32:01.070988923 +0000 UTC m=+864.529020464" lastFinishedPulling="2026-03-20 17:32:05.695551023 +0000 UTC m=+869.153582574" observedRunningTime="2026-03-20 17:32:06.189409444 +0000 UTC m=+869.647440985" watchObservedRunningTime="2026-03-20 17:32:06.208265651 +0000 UTC m=+869.666297192" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.210728 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:32:06 crc kubenswrapper[4795]: E0320 
17:32:06.213576 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d\": container with ID starting with b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d not found: ID does not exist" containerID="b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.213635 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d"} err="failed to get container status \"b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d\": rpc error: code = NotFound desc = could not find container \"b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d\": container with ID starting with b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d not found: ID does not exist" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.214022 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.535056 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.133154 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4gbg2" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="registry-server" containerID="cri-o://9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5" gracePeriod=2 Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.264281 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596dd5ef-f287-4f26-9618-c7763a911124" path="/var/lib/kubelet/pods/596dd5ef-f287-4f26-9618-c7763a911124/volumes" Mar 20 17:32:07 crc 
kubenswrapper[4795]: I0320 17:32:07.265718 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740c1ddf-96e5-46f6-837c-73372748464e" path="/var/lib/kubelet/pods/740c1ddf-96e5-46f6-837c-73372748464e/volumes" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.516071 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.594934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content\") pod \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.595004 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities\") pod \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.595030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr5sr\" (UniqueName: \"kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr\") pod \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.596474 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities" (OuterVolumeSpecName: "utilities") pod "6158d7fa-4982-4817-9c29-b1f3c3fd70d7" (UID: "6158d7fa-4982-4817-9c29-b1f3c3fd70d7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.604586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr" (OuterVolumeSpecName: "kube-api-access-hr5sr") pod "6158d7fa-4982-4817-9c29-b1f3c3fd70d7" (UID: "6158d7fa-4982-4817-9c29-b1f3c3fd70d7"). InnerVolumeSpecName "kube-api-access-hr5sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.672233 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6158d7fa-4982-4817-9c29-b1f3c3fd70d7" (UID: "6158d7fa-4982-4817-9c29-b1f3c3fd70d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.696674 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.696870 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.696882 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr5sr\" (UniqueName: \"kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.142890 4795 generic.go:334] "Generic (PLEG): container finished" podID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" 
containerID="9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5" exitCode=0 Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.142966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerDied","Data":"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5"} Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.142995 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.143022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerDied","Data":"306762ba2f3a916ee3f297df221a3a99b8f647575a6fbd57139b3caf80350c2e"} Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.143049 4795 scope.go:117] "RemoveContainer" containerID="9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.177953 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.183189 4795 scope.go:117] "RemoveContainer" containerID="1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.186369 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.213750 4795 scope.go:117] "RemoveContainer" containerID="341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.236291 4795 scope.go:117] "RemoveContainer" containerID="9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5" Mar 20 
17:32:08 crc kubenswrapper[4795]: E0320 17:32:08.236867 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5\": container with ID starting with 9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5 not found: ID does not exist" containerID="9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.236974 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5"} err="failed to get container status \"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5\": rpc error: code = NotFound desc = could not find container \"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5\": container with ID starting with 9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5 not found: ID does not exist" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.237028 4795 scope.go:117] "RemoveContainer" containerID="1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386" Mar 20 17:32:08 crc kubenswrapper[4795]: E0320 17:32:08.237482 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386\": container with ID starting with 1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386 not found: ID does not exist" containerID="1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.237546 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386"} err="failed to get container status 
\"1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386\": rpc error: code = NotFound desc = could not find container \"1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386\": container with ID starting with 1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386 not found: ID does not exist" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.237587 4795 scope.go:117] "RemoveContainer" containerID="341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4" Mar 20 17:32:08 crc kubenswrapper[4795]: E0320 17:32:08.238096 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4\": container with ID starting with 341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4 not found: ID does not exist" containerID="341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.238134 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4"} err="failed to get container status \"341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4\": rpc error: code = NotFound desc = could not find container \"341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4\": container with ID starting with 341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4 not found: ID does not exist" Mar 20 17:32:09 crc kubenswrapper[4795]: I0320 17:32:09.154058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" event={"ID":"65c42497-77ba-49bc-a292-5003a353fde6","Type":"ContainerStarted","Data":"f83c019099cbb0a94a45c2f70d6c8fc2b9bda454b59ae1fe43d96c2782fd7021"} Mar 20 17:32:09 crc kubenswrapper[4795]: I0320 17:32:09.270541 4795 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" path="/var/lib/kubelet/pods/6158d7fa-4982-4817-9c29-b1f3c3fd70d7/volumes" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.086784 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.115544 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" podStartSLOduration=4.172647777 podStartE2EDuration="11.115519436s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="2026-03-20 17:32:01.244449974 +0000 UTC m=+864.702481505" lastFinishedPulling="2026-03-20 17:32:08.187321623 +0000 UTC m=+871.645353164" observedRunningTime="2026-03-20 17:32:09.193535204 +0000 UTC m=+872.651566795" watchObservedRunningTime="2026-03-20 17:32:11.115519436 +0000 UTC m=+874.573551007" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.298294 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.298371 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.301469 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.301535 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.301587 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.302249 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.302353 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842" gracePeriod=600 Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.305490 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.191457 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842" exitCode=0 Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.191560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842"} Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.191902 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc"} Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.191938 4795 scope.go:117] "RemoveContainer" containerID="ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f" Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.199209 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.292711 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hn4r8"] Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.940446 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941434 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" containerName="oc" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941453 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" containerName="oc" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941469 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="extract-utilities" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941477 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="extract-utilities" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941493 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941500 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941518 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941525 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941539 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="extract-content" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941547 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="extract-content" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941557 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="extract-content" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941564 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="extract-content" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941582 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="extract-utilities" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941592 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="extract-utilities" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941775 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941796 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" containerName="oc" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941808 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.942727 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.958991 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.025318 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.025442 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6crb\" (UniqueName: \"kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.025524 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 
17:32:16.126878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.126935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6crb\" (UniqueName: \"kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.126978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.127547 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.127776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.161890 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6crb\" (UniqueName: \"kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.263000 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.494906 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:16 crc kubenswrapper[4795]: W0320 17:32:16.501290 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7dec918_b9ad_46a4_b161_6006552b910e.slice/crio-ac625ba6991f66b05a7da35ba3253419b0714209879d9705988d2d55f59c5e08 WatchSource:0}: Error finding container ac625ba6991f66b05a7da35ba3253419b0714209879d9705988d2d55f59c5e08: Status 404 returned error can't find the container with id ac625ba6991f66b05a7da35ba3253419b0714209879d9705988d2d55f59c5e08 Mar 20 17:32:17 crc kubenswrapper[4795]: I0320 17:32:17.237064 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7dec918-b9ad-46a4-b161-6006552b910e" containerID="07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d" exitCode=0 Mar 20 17:32:17 crc kubenswrapper[4795]: I0320 17:32:17.237132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerDied","Data":"07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d"} Mar 20 17:32:17 crc kubenswrapper[4795]: I0320 17:32:17.237173 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" 
event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerStarted","Data":"ac625ba6991f66b05a7da35ba3253419b0714209879d9705988d2d55f59c5e08"} Mar 20 17:32:18 crc kubenswrapper[4795]: I0320 17:32:18.249290 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerStarted","Data":"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733"} Mar 20 17:32:19 crc kubenswrapper[4795]: I0320 17:32:19.260133 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7dec918-b9ad-46a4-b161-6006552b910e" containerID="8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733" exitCode=0 Mar 20 17:32:19 crc kubenswrapper[4795]: I0320 17:32:19.271538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerDied","Data":"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733"} Mar 20 17:32:21 crc kubenswrapper[4795]: I0320 17:32:21.023755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:21 crc kubenswrapper[4795]: I0320 17:32:21.278210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerStarted","Data":"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e"} Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 17:32:26.263977 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 17:32:26.264405 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 
17:32:26.334580 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 17:32:26.362236 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sqkkm" podStartSLOduration=8.299844587 podStartE2EDuration="11.362218284s" podCreationTimestamp="2026-03-20 17:32:15 +0000 UTC" firstStartedPulling="2026-03-20 17:32:17.240622061 +0000 UTC m=+880.698653632" lastFinishedPulling="2026-03-20 17:32:20.302995788 +0000 UTC m=+883.761027329" observedRunningTime="2026-03-20 17:32:21.301958988 +0000 UTC m=+884.759990569" watchObservedRunningTime="2026-03-20 17:32:26.362218284 +0000 UTC m=+889.820249835" Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 17:32:26.386985 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 17:32:26.576712 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:28 crc kubenswrapper[4795]: I0320 17:32:28.335474 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sqkkm" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="registry-server" containerID="cri-o://054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e" gracePeriod=2 Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.074968 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.166644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities\") pod \"c7dec918-b9ad-46a4-b161-6006552b910e\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.166781 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content\") pod \"c7dec918-b9ad-46a4-b161-6006552b910e\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.168363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities" (OuterVolumeSpecName: "utilities") pod "c7dec918-b9ad-46a4-b161-6006552b910e" (UID: "c7dec918-b9ad-46a4-b161-6006552b910e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.172958 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6crb\" (UniqueName: \"kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb\") pod \"c7dec918-b9ad-46a4-b161-6006552b910e\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.173566 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.178927 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb" (OuterVolumeSpecName: "kube-api-access-t6crb") pod "c7dec918-b9ad-46a4-b161-6006552b910e" (UID: "c7dec918-b9ad-46a4-b161-6006552b910e"). InnerVolumeSpecName "kube-api-access-t6crb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.227169 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7dec918-b9ad-46a4-b161-6006552b910e" (UID: "c7dec918-b9ad-46a4-b161-6006552b910e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.274597 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.274648 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6crb\" (UniqueName: \"kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.353039 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7dec918-b9ad-46a4-b161-6006552b910e" containerID="054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e" exitCode=0 Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.353123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerDied","Data":"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e"} Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.353141 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.353193 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerDied","Data":"ac625ba6991f66b05a7da35ba3253419b0714209879d9705988d2d55f59c5e08"} Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.353234 4795 scope.go:117] "RemoveContainer" containerID="054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.389168 4795 scope.go:117] "RemoveContainer" containerID="8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.414448 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.419405 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.423793 4795 scope.go:117] "RemoveContainer" containerID="07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.450401 4795 scope.go:117] "RemoveContainer" containerID="054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e" Mar 20 17:32:30 crc kubenswrapper[4795]: E0320 17:32:30.450807 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e\": container with ID starting with 054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e not found: ID does not exist" containerID="054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.450857 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e"} err="failed to get container status \"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e\": rpc error: code = NotFound desc = could not find container \"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e\": container with ID starting with 054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e not found: ID does not exist" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.450925 4795 scope.go:117] "RemoveContainer" containerID="8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733" Mar 20 17:32:30 crc kubenswrapper[4795]: E0320 17:32:30.452066 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733\": container with ID starting with 8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733 not found: ID does not exist" containerID="8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.452169 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733"} err="failed to get container status \"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733\": rpc error: code = NotFound desc = could not find container \"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733\": container with ID starting with 8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733 not found: ID does not exist" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.452194 4795 scope.go:117] "RemoveContainer" containerID="07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d" Mar 20 17:32:30 crc kubenswrapper[4795]: E0320 
17:32:30.452601 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d\": container with ID starting with 07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d not found: ID does not exist" containerID="07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.452626 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d"} err="failed to get container status \"07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d\": rpc error: code = NotFound desc = could not find container \"07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d\": container with ID starting with 07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d not found: ID does not exist" Mar 20 17:32:31 crc kubenswrapper[4795]: I0320 17:32:31.279854 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" path="/var/lib/kubelet/pods/c7dec918-b9ad-46a4-b161-6006552b910e/volumes" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.764556 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk"] Mar 20 17:32:35 crc kubenswrapper[4795]: E0320 17:32:35.765587 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="registry-server" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.765605 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="registry-server" Mar 20 17:32:35 crc kubenswrapper[4795]: E0320 17:32:35.765644 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="extract-utilities" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.765653 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="extract-utilities" Mar 20 17:32:35 crc kubenswrapper[4795]: E0320 17:32:35.765671 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="extract-content" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.765681 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="extract-content" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.765974 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="registry-server" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.766996 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.769472 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.780372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk"] Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.853882 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc 
kubenswrapper[4795]: I0320 17:32:35.854305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m4st\" (UniqueName: \"kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.854480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.956112 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.956219 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m4st\" (UniqueName: \"kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.956261 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.957089 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.957489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.991510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m4st\" (UniqueName: \"kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:36 crc kubenswrapper[4795]: I0320 17:32:36.094840 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:36 crc kubenswrapper[4795]: I0320 17:32:36.347031 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk"] Mar 20 17:32:36 crc kubenswrapper[4795]: I0320 17:32:36.399304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" event={"ID":"6d525cd0-41ce-4352-8ce0-8f24113c89d0","Type":"ContainerStarted","Data":"7637a791c41bad7c601738f0094c7d42302c4b652cd02e91a4b24a6955f0768a"} Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.352769 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hn4r8" podUID="662f8843-e25d-48ce-989d-9ea05937757d" containerName="console" containerID="cri-o://1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0" gracePeriod=15 Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.410401 4795 generic.go:334] "Generic (PLEG): container finished" podID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerID="88c4d3a2ad80e2fc476afe69ca64363007aa94b8e2632cdb73754ffffcfc98b5" exitCode=0 Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.410470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" event={"ID":"6d525cd0-41ce-4352-8ce0-8f24113c89d0","Type":"ContainerDied","Data":"88c4d3a2ad80e2fc476afe69ca64363007aa94b8e2632cdb73754ffffcfc98b5"} Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.769014 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hn4r8_662f8843-e25d-48ce-989d-9ea05937757d/console/0.log" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.769124 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.880603 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.880756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.880818 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.881887 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.881982 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config" (OuterVolumeSpecName: "console-config") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.882002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.882111 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.882228 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dskm\" (UniqueName: \"kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.882623 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.883218 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.883268 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.883926 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca" (OuterVolumeSpecName: "service-ca") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.884056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.887206 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.887816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm" (OuterVolumeSpecName: "kube-api-access-5dskm") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "kube-api-access-5dskm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.891762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.984398 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dskm\" (UniqueName: \"kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.984453 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.984462 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.984470 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.984479 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.421882 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hn4r8_662f8843-e25d-48ce-989d-9ea05937757d/console/0.log" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.422389 4795 generic.go:334] "Generic (PLEG): container finished" podID="662f8843-e25d-48ce-989d-9ea05937757d" containerID="1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0" exitCode=2 Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.422439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hn4r8" event={"ID":"662f8843-e25d-48ce-989d-9ea05937757d","Type":"ContainerDied","Data":"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0"} Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.422485 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hn4r8" event={"ID":"662f8843-e25d-48ce-989d-9ea05937757d","Type":"ContainerDied","Data":"c7f65d1274bb19079f9f79351a782d7495541a1ecdc8d88a866af54812721807"} Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.422521 4795 scope.go:117] "RemoveContainer" containerID="1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.422784 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.456493 4795 scope.go:117] "RemoveContainer" containerID="1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0" Mar 20 17:32:38 crc kubenswrapper[4795]: E0320 17:32:38.457185 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0\": container with ID starting with 1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0 not found: ID does not exist" containerID="1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.457253 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0"} err="failed to get container status \"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0\": rpc error: code = NotFound desc = could not find container \"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0\": container with ID starting with 1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0 not found: ID does not exist" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.488001 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hn4r8"] Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.496230 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hn4r8"] Mar 20 17:32:39 crc kubenswrapper[4795]: I0320 17:32:39.265220 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="662f8843-e25d-48ce-989d-9ea05937757d" path="/var/lib/kubelet/pods/662f8843-e25d-48ce-989d-9ea05937757d/volumes" Mar 20 17:32:40 crc kubenswrapper[4795]: I0320 17:32:40.442501 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerID="e8e016cf54131355bcae914bb33679be4df5b923faa8aa326ed84ef75f46b216" exitCode=0 Mar 20 17:32:40 crc kubenswrapper[4795]: I0320 17:32:40.442590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" event={"ID":"6d525cd0-41ce-4352-8ce0-8f24113c89d0","Type":"ContainerDied","Data":"e8e016cf54131355bcae914bb33679be4df5b923faa8aa326ed84ef75f46b216"} Mar 20 17:32:41 crc kubenswrapper[4795]: I0320 17:32:41.454679 4795 generic.go:334] "Generic (PLEG): container finished" podID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerID="a4557c0a88edbaa0a8b87f75e4839dd3dcfda99e54e04b9994d601e11f01d82b" exitCode=0 Mar 20 17:32:41 crc kubenswrapper[4795]: I0320 17:32:41.455048 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" event={"ID":"6d525cd0-41ce-4352-8ce0-8f24113c89d0","Type":"ContainerDied","Data":"a4557c0a88edbaa0a8b87f75e4839dd3dcfda99e54e04b9994d601e11f01d82b"} Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.850938 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.957601 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle\") pod \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.957676 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util\") pod \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.957763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m4st\" (UniqueName: \"kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st\") pod \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.959113 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle" (OuterVolumeSpecName: "bundle") pod "6d525cd0-41ce-4352-8ce0-8f24113c89d0" (UID: "6d525cd0-41ce-4352-8ce0-8f24113c89d0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.964518 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st" (OuterVolumeSpecName: "kube-api-access-4m4st") pod "6d525cd0-41ce-4352-8ce0-8f24113c89d0" (UID: "6d525cd0-41ce-4352-8ce0-8f24113c89d0"). InnerVolumeSpecName "kube-api-access-4m4st". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.972525 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util" (OuterVolumeSpecName: "util") pod "6d525cd0-41ce-4352-8ce0-8f24113c89d0" (UID: "6d525cd0-41ce-4352-8ce0-8f24113c89d0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.060009 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.060069 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.060135 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m4st\" (UniqueName: \"kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.471424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" event={"ID":"6d525cd0-41ce-4352-8ce0-8f24113c89d0","Type":"ContainerDied","Data":"7637a791c41bad7c601738f0094c7d42302c4b652cd02e91a4b24a6955f0768a"} Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.471497 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7637a791c41bad7c601738f0094c7d42302c4b652cd02e91a4b24a6955f0768a" Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.471566 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:49 crc kubenswrapper[4795]: I0320 17:32:49.374760 4795 scope.go:117] "RemoveContainer" containerID="ea095688dd8877661afbf85ce172a04981e2524e4cbc5e45ea0fa637fadfbc39" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.886475 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj"] Mar 20 17:32:53 crc kubenswrapper[4795]: E0320 17:32:53.887308 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="662f8843-e25d-48ce-989d-9ea05937757d" containerName="console" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887325 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="662f8843-e25d-48ce-989d-9ea05937757d" containerName="console" Mar 20 17:32:53 crc kubenswrapper[4795]: E0320 17:32:53.887345 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="extract" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887353 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="extract" Mar 20 17:32:53 crc kubenswrapper[4795]: E0320 17:32:53.887366 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="util" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887374 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="util" Mar 20 17:32:53 crc kubenswrapper[4795]: E0320 17:32:53.887384 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="pull" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887392 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="pull" 
Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887513 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="extract" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887539 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="662f8843-e25d-48ce-989d-9ea05937757d" containerName="console" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.888079 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.893203 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.893353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.894152 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.894203 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.894424 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2zg9w" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.904179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj"] Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.915359 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-apiservice-cert\") pod 
\"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.915424 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-webhook-cert\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.915502 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7mf5\" (UniqueName: \"kubernetes.io/projected/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-kube-api-access-s7mf5\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.016713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mf5\" (UniqueName: \"kubernetes.io/projected/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-kube-api-access-s7mf5\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.016770 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-apiservice-cert\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " 
pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.016806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-webhook-cert\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.026435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-apiservice-cert\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.026442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-webhook-cert\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.036153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7mf5\" (UniqueName: \"kubernetes.io/projected/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-kube-api-access-s7mf5\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.133449 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn"] Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.134238 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.137245 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.137500 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.137936 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2bb4s" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.151253 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn"] Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.204513 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.387790 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvn5f\" (UniqueName: \"kubernetes.io/projected/2d29ac93-da31-4834-a858-d5bd9adb28d1-kube-api-access-pvn5f\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.387828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-apiservice-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.387872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-webhook-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.492229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-webhook-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.492627 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pvn5f\" (UniqueName: \"kubernetes.io/projected/2d29ac93-da31-4834-a858-d5bd9adb28d1-kube-api-access-pvn5f\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.492668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-apiservice-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.497363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-apiservice-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.503267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-webhook-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.516016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvn5f\" (UniqueName: \"kubernetes.io/projected/2d29ac93-da31-4834-a858-d5bd9adb28d1-kube-api-access-pvn5f\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " 
pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.682946 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj"] Mar 20 17:32:54 crc kubenswrapper[4795]: W0320 17:32:54.691303 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e8dba8d_8387_4ced_ac54_b8d5e1cf3650.slice/crio-0215583718df8f90686f9f9d510b84b56d69b9e2094799ec63df11fbc2fc0631 WatchSource:0}: Error finding container 0215583718df8f90686f9f9d510b84b56d69b9e2094799ec63df11fbc2fc0631: Status 404 returned error can't find the container with id 0215583718df8f90686f9f9d510b84b56d69b9e2094799ec63df11fbc2fc0631 Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.749570 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:55 crc kubenswrapper[4795]: I0320 17:32:55.180104 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn"] Mar 20 17:32:55 crc kubenswrapper[4795]: W0320 17:32:55.184245 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d29ac93_da31_4834_a858_d5bd9adb28d1.slice/crio-f96f11e17a4f39721398b1877a37bdda5943061dd228d066737a9fd2f9b31cce WatchSource:0}: Error finding container f96f11e17a4f39721398b1877a37bdda5943061dd228d066737a9fd2f9b31cce: Status 404 returned error can't find the container with id f96f11e17a4f39721398b1877a37bdda5943061dd228d066737a9fd2f9b31cce Mar 20 17:32:55 crc kubenswrapper[4795]: I0320 17:32:55.569587 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" 
event={"ID":"2d29ac93-da31-4834-a858-d5bd9adb28d1","Type":"ContainerStarted","Data":"f96f11e17a4f39721398b1877a37bdda5943061dd228d066737a9fd2f9b31cce"} Mar 20 17:32:55 crc kubenswrapper[4795]: I0320 17:32:55.571105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" event={"ID":"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650","Type":"ContainerStarted","Data":"0215583718df8f90686f9f9d510b84b56d69b9e2094799ec63df11fbc2fc0631"} Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.601738 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" event={"ID":"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650","Type":"ContainerStarted","Data":"f91d74b3e73995b5f9612f408ecd9f5188f94046424f89e59f41879694428f17"} Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.602615 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.610078 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" event={"ID":"2d29ac93-da31-4834-a858-d5bd9adb28d1","Type":"ContainerStarted","Data":"7db33e2ff1b9b090bbdefd1d2955608a95dcea00951a1a56f2ad252b0294020a"} Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.610896 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.678289 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" podStartSLOduration=2.234754039 podStartE2EDuration="6.678268596s" podCreationTimestamp="2026-03-20 17:32:53 +0000 UTC" firstStartedPulling="2026-03-20 17:32:54.695714285 +0000 UTC m=+918.153745826" 
lastFinishedPulling="2026-03-20 17:32:59.139228842 +0000 UTC m=+922.597260383" observedRunningTime="2026-03-20 17:32:59.640646083 +0000 UTC m=+923.098677624" watchObservedRunningTime="2026-03-20 17:32:59.678268596 +0000 UTC m=+923.136300137" Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.679737 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" podStartSLOduration=1.638526994 podStartE2EDuration="5.679728493s" podCreationTimestamp="2026-03-20 17:32:54 +0000 UTC" firstStartedPulling="2026-03-20 17:32:55.190715762 +0000 UTC m=+918.648747303" lastFinishedPulling="2026-03-20 17:32:59.231917261 +0000 UTC m=+922.689948802" observedRunningTime="2026-03-20 17:32:59.667331109 +0000 UTC m=+923.125362650" watchObservedRunningTime="2026-03-20 17:32:59.679728493 +0000 UTC m=+923.137760044" Mar 20 17:33:14 crc kubenswrapper[4795]: I0320 17:33:14.754304 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:33:34 crc kubenswrapper[4795]: I0320 17:33:34.207073 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.001556 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-66lbd"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.020467 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.026986 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.027604 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.028298 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.028848 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kv4mk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.029178 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.030439 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.034079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.093315 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bl9qp"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.094212 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.096811 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.096967 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zwrjm" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.097115 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.097243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.100913 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-kvtc5"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.102136 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.104070 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.119723 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-kvtc5"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-reloader\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-conf\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-sockets\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156372 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw4vm\" (UniqueName: \"kubernetes.io/projected/a748ee28-0a26-4700-b384-3afa65b8ac9d-kube-api-access-cw4vm\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156404 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics-certs\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfrdt\" (UniqueName: \"kubernetes.io/projected/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-kube-api-access-cfrdt\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156473 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-startup\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257134 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-metrics-certs\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257180 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4qsm\" (UniqueName: \"kubernetes.io/projected/2ce06e1f-5454-4b85-888b-3230c0086c2e-kube-api-access-z4qsm\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257211 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-conf\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257284 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-sockets\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257417 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw4vm\" (UniqueName: 
\"kubernetes.io/projected/a748ee28-0a26-4700-b384-3afa65b8ac9d-kube-api-access-cw4vm\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8834c8fc-36f7-41da-867f-ec5a32e25b36-metallb-excludel2\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics-certs\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257579 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbg6h\" (UniqueName: \"kubernetes.io/projected/8834c8fc-36f7-41da-867f-ec5a32e25b36-kube-api-access-wbg6h\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257625 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257636 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics\") pod \"frr-k8s-66lbd\" (UID: 
\"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257649 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfrdt\" (UniqueName: \"kubernetes.io/projected/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-kube-api-access-cfrdt\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257671 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-startup\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257722 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-cert\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257728 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-sockets\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257761 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-metrics-certs\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " 
pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257839 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-reloader\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-conf\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.258204 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-reloader\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.258707 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-startup\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.263545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.274402 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics-certs\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.274450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfrdt\" (UniqueName: \"kubernetes.io/projected/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-kube-api-access-cfrdt\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.276841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw4vm\" (UniqueName: \"kubernetes.io/projected/a748ee28-0a26-4700-b384-3afa65b8ac9d-kube-api-access-cw4vm\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.349081 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.355528 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8834c8fc-36f7-41da-867f-ec5a32e25b36-metallb-excludel2\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359176 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbg6h\" (UniqueName: \"kubernetes.io/projected/8834c8fc-36f7-41da-867f-ec5a32e25b36-kube-api-access-wbg6h\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359205 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-cert\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-metrics-certs\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 
17:33:35.359303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-metrics-certs\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4qsm\" (UniqueName: \"kubernetes.io/projected/2ce06e1f-5454-4b85-888b-3230c0086c2e-kube-api-access-z4qsm\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: E0320 17:33:35.359714 4795 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 17:33:35 crc kubenswrapper[4795]: E0320 17:33:35.359825 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist podName:8834c8fc-36f7-41da-867f-ec5a32e25b36 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:35.859805885 +0000 UTC m=+959.317837416 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist") pod "speaker-bl9qp" (UID: "8834c8fc-36f7-41da-867f-ec5a32e25b36") : secret "metallb-memberlist" not found Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.360002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8834c8fc-36f7-41da-867f-ec5a32e25b36-metallb-excludel2\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.366124 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-cert\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.367153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-metrics-certs\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.367231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-metrics-certs\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.374727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbg6h\" (UniqueName: \"kubernetes.io/projected/8834c8fc-36f7-41da-867f-ec5a32e25b36-kube-api-access-wbg6h\") pod \"speaker-bl9qp\" (UID: 
\"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.375136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4qsm\" (UniqueName: \"kubernetes.io/projected/2ce06e1f-5454-4b85-888b-3230c0086c2e-kube-api-access-z4qsm\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.421879 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.504166 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.594557 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk"] Mar 20 17:33:35 crc kubenswrapper[4795]: W0320 17:33:35.605924 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod377dbbb7_0571_40cd_9fe3_3c86fbf4f092.slice/crio-45a7b944edac1e024860a643b86626876d99698be893f8e2a415be1b99491d36 WatchSource:0}: Error finding container 45a7b944edac1e024860a643b86626876d99698be893f8e2a415be1b99491d36: Status 404 returned error can't find the container with id 45a7b944edac1e024860a643b86626876d99698be893f8e2a415be1b99491d36 Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.669179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-kvtc5"] Mar 20 17:33:35 crc kubenswrapper[4795]: W0320 17:33:35.670915 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ce06e1f_5454_4b85_888b_3230c0086c2e.slice/crio-432bc5993ebf899ffa3de0cfccca983585acaf9969b445abb90d2ac90a38840a WatchSource:0}: Error finding container 432bc5993ebf899ffa3de0cfccca983585acaf9969b445abb90d2ac90a38840a: Status 404 returned error can't find the container with id 432bc5993ebf899ffa3de0cfccca983585acaf9969b445abb90d2ac90a38840a Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.853394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"3c6895663363858ae342f3e26060bd577e242c085ed55ce1e69814251bd4289e"} Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.854840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" event={"ID":"377dbbb7-0571-40cd-9fe3-3c86fbf4f092","Type":"ContainerStarted","Data":"45a7b944edac1e024860a643b86626876d99698be893f8e2a415be1b99491d36"} Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.857661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kvtc5" event={"ID":"2ce06e1f-5454-4b85-888b-3230c0086c2e","Type":"ContainerStarted","Data":"87543faa1add05e5db5b3c208fd9c39da9bf337e18b6b46e7bb90710e114bb28"} Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.857730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kvtc5" event={"ID":"2ce06e1f-5454-4b85-888b-3230c0086c2e","Type":"ContainerStarted","Data":"432bc5993ebf899ffa3de0cfccca983585acaf9969b445abb90d2ac90a38840a"} Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.867362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") 
" pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: E0320 17:33:35.867519 4795 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 17:33:35 crc kubenswrapper[4795]: E0320 17:33:35.867620 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist podName:8834c8fc-36f7-41da-867f-ec5a32e25b36 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:36.867593768 +0000 UTC m=+960.325625329 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist") pod "speaker-bl9qp" (UID: "8834c8fc-36f7-41da-867f-ec5a32e25b36") : secret "metallb-memberlist" not found Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.868653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kvtc5" event={"ID":"2ce06e1f-5454-4b85-888b-3230c0086c2e","Type":"ContainerStarted","Data":"73c816d4902fa5f36ea0b719f42fb9e5c6f702186fd0eaecdac1bcb98f0a70a7"} Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.869174 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.882293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.889439 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-kvtc5" podStartSLOduration=1.889418503 podStartE2EDuration="1.889418503s" podCreationTimestamp="2026-03-20 17:33:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:36.889150125 +0000 UTC m=+960.347181666" watchObservedRunningTime="2026-03-20 17:33:36.889418503 +0000 UTC m=+960.347450054" Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.893368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.912874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bl9qp" Mar 20 17:33:36 crc kubenswrapper[4795]: W0320 17:33:36.934378 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8834c8fc_36f7_41da_867f_ec5a32e25b36.slice/crio-340745e9296bee4add971b6c93233ef676603b87935ef4d0d0bf5c778589b409 WatchSource:0}: Error finding container 340745e9296bee4add971b6c93233ef676603b87935ef4d0d0bf5c778589b409: Status 404 returned error can't find the container with id 340745e9296bee4add971b6c93233ef676603b87935ef4d0d0bf5c778589b409 Mar 20 17:33:37 crc kubenswrapper[4795]: I0320 17:33:37.876880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bl9qp" event={"ID":"8834c8fc-36f7-41da-867f-ec5a32e25b36","Type":"ContainerStarted","Data":"ecf59a1d81c53ef7c38d2ffe50195692e67f83977cd308c3bb82a0e9eda4b3be"} Mar 20 17:33:37 crc kubenswrapper[4795]: I0320 17:33:37.877392 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bl9qp" event={"ID":"8834c8fc-36f7-41da-867f-ec5a32e25b36","Type":"ContainerStarted","Data":"36d8927e87fcdd7c48f3fb7381aac211e7f656aebf76d5e4d14eeeb505b70d3d"} Mar 20 17:33:37 crc kubenswrapper[4795]: I0320 17:33:37.877404 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bl9qp" event={"ID":"8834c8fc-36f7-41da-867f-ec5a32e25b36","Type":"ContainerStarted","Data":"340745e9296bee4add971b6c93233ef676603b87935ef4d0d0bf5c778589b409"} Mar 20 17:33:37 crc kubenswrapper[4795]: I0320 17:33:37.877585 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bl9qp" Mar 20 17:33:37 crc kubenswrapper[4795]: I0320 17:33:37.900120 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bl9qp" podStartSLOduration=2.900100225 podStartE2EDuration="2.900100225s" podCreationTimestamp="2026-03-20 17:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:37.895297203 +0000 UTC m=+961.353328744" watchObservedRunningTime="2026-03-20 17:33:37.900100225 +0000 UTC m=+961.358131766" Mar 20 17:33:42 crc kubenswrapper[4795]: I0320 17:33:42.914405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" event={"ID":"377dbbb7-0571-40cd-9fe3-3c86fbf4f092","Type":"ContainerStarted","Data":"ca0d998564af719a65ad2a27fe4207130063d44a3182cb718440c781f43e2879"} Mar 20 17:33:42 crc kubenswrapper[4795]: I0320 17:33:42.914844 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:42 crc kubenswrapper[4795]: I0320 17:33:42.918442 4795 generic.go:334] "Generic (PLEG): container finished" podID="a748ee28-0a26-4700-b384-3afa65b8ac9d" containerID="9ee46ca367ff1a085d923d8850b01ab5d9eccab652d16ced57a2f7ea507d3ae5" exitCode=0 Mar 20 17:33:42 crc kubenswrapper[4795]: I0320 17:33:42.918501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" 
event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerDied","Data":"9ee46ca367ff1a085d923d8850b01ab5d9eccab652d16ced57a2f7ea507d3ae5"} Mar 20 17:33:42 crc kubenswrapper[4795]: I0320 17:33:42.950167 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" podStartSLOduration=2.268513712 podStartE2EDuration="8.950141097s" podCreationTimestamp="2026-03-20 17:33:34 +0000 UTC" firstStartedPulling="2026-03-20 17:33:35.607441718 +0000 UTC m=+959.065473259" lastFinishedPulling="2026-03-20 17:33:42.289069103 +0000 UTC m=+965.747100644" observedRunningTime="2026-03-20 17:33:42.943536858 +0000 UTC m=+966.401568469" watchObservedRunningTime="2026-03-20 17:33:42.950141097 +0000 UTC m=+966.408172668" Mar 20 17:33:43 crc kubenswrapper[4795]: I0320 17:33:43.930638 4795 generic.go:334] "Generic (PLEG): container finished" podID="a748ee28-0a26-4700-b384-3afa65b8ac9d" containerID="98e2d9a872128eabc898525d34dbc099336810588701e0b2261f093cf837ef14" exitCode=0 Mar 20 17:33:43 crc kubenswrapper[4795]: I0320 17:33:43.930924 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerDied","Data":"98e2d9a872128eabc898525d34dbc099336810588701e0b2261f093cf837ef14"} Mar 20 17:33:44 crc kubenswrapper[4795]: I0320 17:33:44.941592 4795 generic.go:334] "Generic (PLEG): container finished" podID="a748ee28-0a26-4700-b384-3afa65b8ac9d" containerID="07639cb26bcdf1ced6b6b72d8fbf3fb6114c2e3bfac1ea404f0d143cb7bf1eef" exitCode=0 Mar 20 17:33:44 crc kubenswrapper[4795]: I0320 17:33:44.941633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerDied","Data":"07639cb26bcdf1ced6b6b72d8fbf3fb6114c2e3bfac1ea404f0d143cb7bf1eef"} Mar 20 17:33:45 crc kubenswrapper[4795]: I0320 17:33:45.429724 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:45 crc kubenswrapper[4795]: I0320 17:33:45.953210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"b1dde30b0c701a10860f13a910a9e8f9803b8392320e34463cb0db966deabbf1"} Mar 20 17:33:45 crc kubenswrapper[4795]: I0320 17:33:45.953270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"03cc33862050c1fc5ae5652d429e1f638e56e81c92a19827794c18c508ff5b0a"} Mar 20 17:33:45 crc kubenswrapper[4795]: I0320 17:33:45.953289 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"109bbf73da794c2069c5d335485f4278e573bdf15539b1d75755092218da2526"} Mar 20 17:33:45 crc kubenswrapper[4795]: I0320 17:33:45.953306 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"f11a7537fed5e62acea990af82c3b5be65fb8cc112f07350709fa90157d520de"} Mar 20 17:33:46 crc kubenswrapper[4795]: I0320 17:33:46.972037 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"de36b2cf48ac7ad1dd4fa320be69cbdbeab02b55529f134cda0c2a3535115728"} Mar 20 17:33:46 crc kubenswrapper[4795]: I0320 17:33:46.972089 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"8b13a57fdc77700dedba599b9a9a5d8568c31fb585e887369e3e6780299a60de"} Mar 20 17:33:46 crc kubenswrapper[4795]: I0320 17:33:46.972265 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:47 crc kubenswrapper[4795]: I0320 17:33:47.006176 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-66lbd" podStartSLOduration=6.221770913 podStartE2EDuration="13.006148246s" podCreationTimestamp="2026-03-20 17:33:34 +0000 UTC" firstStartedPulling="2026-03-20 17:33:35.503971387 +0000 UTC m=+958.962002928" lastFinishedPulling="2026-03-20 17:33:42.28834872 +0000 UTC m=+965.746380261" observedRunningTime="2026-03-20 17:33:47.00503104 +0000 UTC m=+970.463062611" watchObservedRunningTime="2026-03-20 17:33:47.006148246 +0000 UTC m=+970.464179797" Mar 20 17:33:50 crc kubenswrapper[4795]: I0320 17:33:50.349638 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:50 crc kubenswrapper[4795]: I0320 17:33:50.404875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:55 crc kubenswrapper[4795]: I0320 17:33:55.354654 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:55 crc kubenswrapper[4795]: I0320 17:33:55.365546 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:56 crc kubenswrapper[4795]: I0320 17:33:56.917900 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bl9qp" Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.753522 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.754395 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.756710 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-glwb4" Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.756875 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.758248 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.794322 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.932194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khnnq\" (UniqueName: \"kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq\") pod \"openstack-operator-index-bhhgs\" (UID: \"ab99d32a-4e50-468c-8eb2-1f12db5e9981\") " pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.033839 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khnnq\" (UniqueName: \"kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq\") pod \"openstack-operator-index-bhhgs\" (UID: \"ab99d32a-4e50-468c-8eb2-1f12db5e9981\") " pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.057422 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khnnq\" (UniqueName: \"kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq\") pod \"openstack-operator-index-bhhgs\" (UID: 
\"ab99d32a-4e50-468c-8eb2-1f12db5e9981\") " pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.099620 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.137180 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567134-glqvv"] Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.138161 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.139498 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.140239 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.140450 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.160557 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-glqvv"] Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.240398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqm69\" (UniqueName: \"kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69\") pod \"auto-csr-approver-29567134-glqvv\" (UID: \"1462264f-6c8a-4024-9465-3e7d2908ba24\") " pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.343021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqm69\" (UniqueName: 
\"kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69\") pod \"auto-csr-approver-29567134-glqvv\" (UID: \"1462264f-6c8a-4024-9465-3e7d2908ba24\") " pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.357223 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.359176 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqm69\" (UniqueName: \"kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69\") pod \"auto-csr-approver-29567134-glqvv\" (UID: \"1462264f-6c8a-4024-9465-3e7d2908ba24\") " pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.509286 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.716057 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-glqvv"] Mar 20 17:34:00 crc kubenswrapper[4795]: W0320 17:34:00.720279 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1462264f_6c8a_4024_9465_3e7d2908ba24.slice/crio-5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9 WatchSource:0}: Error finding container 5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9: Status 404 returned error can't find the container with id 5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9 Mar 20 17:34:01 crc kubenswrapper[4795]: I0320 17:34:01.063525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bhhgs" 
event={"ID":"ab99d32a-4e50-468c-8eb2-1f12db5e9981","Type":"ContainerStarted","Data":"0da8b87bf441877de837f21b2b0a85c9b5aca4d6bb51944bfc828ea8a620dfe8"} Mar 20 17:34:01 crc kubenswrapper[4795]: I0320 17:34:01.064263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-glqvv" event={"ID":"1462264f-6c8a-4024-9465-3e7d2908ba24","Type":"ContainerStarted","Data":"5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9"} Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.124940 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.736704 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-b6ckg"] Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.737891 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.756792 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b6ckg"] Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.893749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhs4b\" (UniqueName: \"kubernetes.io/projected/3aeffd27-d2c7-4744-8e01-07a4db74597e-kube-api-access-mhs4b\") pod \"openstack-operator-index-b6ckg\" (UID: \"3aeffd27-d2c7-4744-8e01-07a4db74597e\") " pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.995291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhs4b\" (UniqueName: \"kubernetes.io/projected/3aeffd27-d2c7-4744-8e01-07a4db74597e-kube-api-access-mhs4b\") pod \"openstack-operator-index-b6ckg\" (UID: \"3aeffd27-d2c7-4744-8e01-07a4db74597e\") " 
pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.025184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhs4b\" (UniqueName: \"kubernetes.io/projected/3aeffd27-d2c7-4744-8e01-07a4db74597e-kube-api-access-mhs4b\") pod \"openstack-operator-index-b6ckg\" (UID: \"3aeffd27-d2c7-4744-8e01-07a4db74597e\") " pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.089659 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.100779 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bhhgs" event={"ID":"ab99d32a-4e50-468c-8eb2-1f12db5e9981","Type":"ContainerStarted","Data":"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8"} Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.100909 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-bhhgs" podUID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" containerName="registry-server" containerID="cri-o://15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8" gracePeriod=2 Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.107896 4795 generic.go:334] "Generic (PLEG): container finished" podID="1462264f-6c8a-4024-9465-3e7d2908ba24" containerID="22dcfbd2225d9c0ffa8966a0b94e82b8d86d62d5548dc394c4f180ba099a7edd" exitCode=0 Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.107971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-glqvv" event={"ID":"1462264f-6c8a-4024-9465-3e7d2908ba24","Type":"ContainerDied","Data":"22dcfbd2225d9c0ffa8966a0b94e82b8d86d62d5548dc394c4f180ba099a7edd"} Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 
17:34:04.146437 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bhhgs" podStartSLOduration=2.40082847 podStartE2EDuration="5.146407372s" podCreationTimestamp="2026-03-20 17:33:59 +0000 UTC" firstStartedPulling="2026-03-20 17:34:00.368567392 +0000 UTC m=+983.826598933" lastFinishedPulling="2026-03-20 17:34:03.114146264 +0000 UTC m=+986.572177835" observedRunningTime="2026-03-20 17:34:04.124656085 +0000 UTC m=+987.582687686" watchObservedRunningTime="2026-03-20 17:34:04.146407372 +0000 UTC m=+987.604438953" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.326363 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b6ckg"] Mar 20 17:34:04 crc kubenswrapper[4795]: W0320 17:34:04.332963 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aeffd27_d2c7_4744_8e01_07a4db74597e.slice/crio-d5dbeca4bb915835841cb950f5de6cff7d2071530558731865ed98c519936a57 WatchSource:0}: Error finding container d5dbeca4bb915835841cb950f5de6cff7d2071530558731865ed98c519936a57: Status 404 returned error can't find the container with id d5dbeca4bb915835841cb950f5de6cff7d2071530558731865ed98c519936a57 Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.550339 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.704259 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khnnq\" (UniqueName: \"kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq\") pod \"ab99d32a-4e50-468c-8eb2-1f12db5e9981\" (UID: \"ab99d32a-4e50-468c-8eb2-1f12db5e9981\") " Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.711274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq" (OuterVolumeSpecName: "kube-api-access-khnnq") pod "ab99d32a-4e50-468c-8eb2-1f12db5e9981" (UID: "ab99d32a-4e50-468c-8eb2-1f12db5e9981"). InnerVolumeSpecName "kube-api-access-khnnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.805583 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khnnq\" (UniqueName: \"kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.119825 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b6ckg" event={"ID":"3aeffd27-d2c7-4744-8e01-07a4db74597e","Type":"ContainerStarted","Data":"fd5b4a3d9366dedff95cc9a6adbd89a7be3445a11ae3e3045855ac21d8023dbb"} Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.119906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b6ckg" event={"ID":"3aeffd27-d2c7-4744-8e01-07a4db74597e","Type":"ContainerStarted","Data":"d5dbeca4bb915835841cb950f5de6cff7d2071530558731865ed98c519936a57"} Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.122448 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" containerID="15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8" exitCode=0 Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.122524 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.122586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bhhgs" event={"ID":"ab99d32a-4e50-468c-8eb2-1f12db5e9981","Type":"ContainerDied","Data":"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8"} Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.122638 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bhhgs" event={"ID":"ab99d32a-4e50-468c-8eb2-1f12db5e9981","Type":"ContainerDied","Data":"0da8b87bf441877de837f21b2b0a85c9b5aca4d6bb51944bfc828ea8a620dfe8"} Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.122670 4795 scope.go:117] "RemoveContainer" containerID="15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.145198 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-b6ckg" podStartSLOduration=2.071242656 podStartE2EDuration="2.145169861s" podCreationTimestamp="2026-03-20 17:34:03 +0000 UTC" firstStartedPulling="2026-03-20 17:34:04.337315131 +0000 UTC m=+987.795346672" lastFinishedPulling="2026-03-20 17:34:04.411242336 +0000 UTC m=+987.869273877" observedRunningTime="2026-03-20 17:34:05.141792975 +0000 UTC m=+988.599824617" watchObservedRunningTime="2026-03-20 17:34:05.145169861 +0000 UTC m=+988.603201442" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.162624 4795 scope.go:117] "RemoveContainer" containerID="15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8" Mar 20 17:34:05 crc 
kubenswrapper[4795]: E0320 17:34:05.165165 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8\": container with ID starting with 15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8 not found: ID does not exist" containerID="15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.165242 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8"} err="failed to get container status \"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8\": rpc error: code = NotFound desc = could not find container \"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8\": container with ID starting with 15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8 not found: ID does not exist" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.179387 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.187257 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.264490 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" path="/var/lib/kubelet/pods/ab99d32a-4e50-468c-8eb2-1f12db5e9981/volumes" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.438742 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.515509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqm69\" (UniqueName: \"kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69\") pod \"1462264f-6c8a-4024-9465-3e7d2908ba24\" (UID: \"1462264f-6c8a-4024-9465-3e7d2908ba24\") " Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.527916 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69" (OuterVolumeSpecName: "kube-api-access-lqm69") pod "1462264f-6c8a-4024-9465-3e7d2908ba24" (UID: "1462264f-6c8a-4024-9465-3e7d2908ba24"). InnerVolumeSpecName "kube-api-access-lqm69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.617786 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqm69\" (UniqueName: \"kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:06 crc kubenswrapper[4795]: I0320 17:34:06.133324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-glqvv" event={"ID":"1462264f-6c8a-4024-9465-3e7d2908ba24","Type":"ContainerDied","Data":"5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9"} Mar 20 17:34:06 crc kubenswrapper[4795]: I0320 17:34:06.133387 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9" Mar 20 17:34:06 crc kubenswrapper[4795]: I0320 17:34:06.133345 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:06 crc kubenswrapper[4795]: I0320 17:34:06.504250 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-bqp8h"] Mar 20 17:34:06 crc kubenswrapper[4795]: I0320 17:34:06.513050 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-bqp8h"] Mar 20 17:34:07 crc kubenswrapper[4795]: I0320 17:34:07.265521 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be9f091-42a0-432b-8f14-700bc3e733cb" path="/var/lib/kubelet/pods/4be9f091-42a0-432b-8f14-700bc3e733cb/volumes" Mar 20 17:34:11 crc kubenswrapper[4795]: I0320 17:34:11.299747 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:34:11 crc kubenswrapper[4795]: I0320 17:34:11.300177 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:34:14 crc kubenswrapper[4795]: I0320 17:34:14.090563 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:14 crc kubenswrapper[4795]: I0320 17:34:14.090643 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:14 crc kubenswrapper[4795]: I0320 17:34:14.136932 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:14 crc kubenswrapper[4795]: I0320 17:34:14.242965 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.789254 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"] Mar 20 17:34:15 crc kubenswrapper[4795]: E0320 17:34:15.789854 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" containerName="registry-server" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.789870 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" containerName="registry-server" Mar 20 17:34:15 crc kubenswrapper[4795]: E0320 17:34:15.789882 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1462264f-6c8a-4024-9465-3e7d2908ba24" containerName="oc" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.789890 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1462264f-6c8a-4024-9465-3e7d2908ba24" containerName="oc" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.790023 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" containerName="registry-server" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.790045 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1462264f-6c8a-4024-9465-3e7d2908ba24" containerName="oc" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.791011 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.793373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lvhj5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.803186 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"] Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.862104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.862214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.862306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhb8\" (UniqueName: \"kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 
17:34:15.963623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhb8\" (UniqueName: \"kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.963795 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.963899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.964552 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.964552 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.995768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhb8\" (UniqueName: \"kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:16 crc kubenswrapper[4795]: I0320 17:34:16.110793 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:16 crc kubenswrapper[4795]: I0320 17:34:16.616259 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"] Mar 20 17:34:16 crc kubenswrapper[4795]: W0320 17:34:16.628009 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7571109_7ce9_44a8_9275_4af4fadbd0e6.slice/crio-346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321 WatchSource:0}: Error finding container 346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321: Status 404 returned error can't find the container with id 346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321 Mar 20 17:34:17 crc kubenswrapper[4795]: I0320 17:34:17.221548 4795 generic.go:334] "Generic (PLEG): container finished" podID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerID="2d67a9a380a082071bfa78d21716eb4b1e97fe88d534c7e295d5fcb7891dfacc" exitCode=0 Mar 20 
17:34:17 crc kubenswrapper[4795]: I0320 17:34:17.221714 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" event={"ID":"f7571109-7ce9-44a8-9275-4af4fadbd0e6","Type":"ContainerDied","Data":"2d67a9a380a082071bfa78d21716eb4b1e97fe88d534c7e295d5fcb7891dfacc"} Mar 20 17:34:17 crc kubenswrapper[4795]: I0320 17:34:17.221928 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" event={"ID":"f7571109-7ce9-44a8-9275-4af4fadbd0e6","Type":"ContainerStarted","Data":"346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321"} Mar 20 17:34:18 crc kubenswrapper[4795]: I0320 17:34:18.235196 4795 generic.go:334] "Generic (PLEG): container finished" podID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerID="90bee4baf4c24459457ba73d1c0b53d701cf2273d32aa38f67a9d73df0665a44" exitCode=0 Mar 20 17:34:18 crc kubenswrapper[4795]: I0320 17:34:18.235276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" event={"ID":"f7571109-7ce9-44a8-9275-4af4fadbd0e6","Type":"ContainerDied","Data":"90bee4baf4c24459457ba73d1c0b53d701cf2273d32aa38f67a9d73df0665a44"} Mar 20 17:34:19 crc kubenswrapper[4795]: I0320 17:34:19.244025 4795 generic.go:334] "Generic (PLEG): container finished" podID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerID="5e0c33eb1ff9dd81990fd915b2c9bc30c9229c45998fe36c98c1b059681a7211" exitCode=0 Mar 20 17:34:19 crc kubenswrapper[4795]: I0320 17:34:19.244088 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" event={"ID":"f7571109-7ce9-44a8-9275-4af4fadbd0e6","Type":"ContainerDied","Data":"5e0c33eb1ff9dd81990fd915b2c9bc30c9229c45998fe36c98c1b059681a7211"} Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.536031 
4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.628579 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwhb8\" (UniqueName: \"kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8\") pod \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.628763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle\") pod \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.628828 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util\") pod \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.629516 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle" (OuterVolumeSpecName: "bundle") pod "f7571109-7ce9-44a8-9275-4af4fadbd0e6" (UID: "f7571109-7ce9-44a8-9275-4af4fadbd0e6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.634807 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8" (OuterVolumeSpecName: "kube-api-access-dwhb8") pod "f7571109-7ce9-44a8-9275-4af4fadbd0e6" (UID: "f7571109-7ce9-44a8-9275-4af4fadbd0e6"). 
InnerVolumeSpecName "kube-api-access-dwhb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.648567 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util" (OuterVolumeSpecName: "util") pod "f7571109-7ce9-44a8-9275-4af4fadbd0e6" (UID: "f7571109-7ce9-44a8-9275-4af4fadbd0e6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.729909 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.729962 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.729974 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwhb8\" (UniqueName: \"kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:21 crc kubenswrapper[4795]: I0320 17:34:21.263034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" event={"ID":"f7571109-7ce9-44a8-9275-4af4fadbd0e6","Type":"ContainerDied","Data":"346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321"} Mar 20 17:34:21 crc kubenswrapper[4795]: I0320 17:34:21.263101 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321" Mar 20 17:34:21 crc kubenswrapper[4795]: I0320 17:34:21.263164 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.074385 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"] Mar 20 17:34:23 crc kubenswrapper[4795]: E0320 17:34:23.074918 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="pull" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.074936 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="pull" Mar 20 17:34:23 crc kubenswrapper[4795]: E0320 17:34:23.074958 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="util" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.074967 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="util" Mar 20 17:34:23 crc kubenswrapper[4795]: E0320 17:34:23.074985 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="extract" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.074993 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="extract" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.075117 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="extract" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.075595 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.077494 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rhg5v" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.119074 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"] Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.162604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2694\" (UniqueName: \"kubernetes.io/projected/084071f5-e58b-451b-9cf5-67203ae1ba02-kube-api-access-c2694\") pod \"openstack-operator-controller-init-65b67cc5c9-vm29j\" (UID: \"084071f5-e58b-451b-9cf5-67203ae1ba02\") " pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.263604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2694\" (UniqueName: \"kubernetes.io/projected/084071f5-e58b-451b-9cf5-67203ae1ba02-kube-api-access-c2694\") pod \"openstack-operator-controller-init-65b67cc5c9-vm29j\" (UID: \"084071f5-e58b-451b-9cf5-67203ae1ba02\") " pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.284574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2694\" (UniqueName: \"kubernetes.io/projected/084071f5-e58b-451b-9cf5-67203ae1ba02-kube-api-access-c2694\") pod \"openstack-operator-controller-init-65b67cc5c9-vm29j\" (UID: \"084071f5-e58b-451b-9cf5-67203ae1ba02\") " pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.442236 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.906349 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"] Mar 20 17:34:23 crc kubenswrapper[4795]: W0320 17:34:23.914987 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod084071f5_e58b_451b_9cf5_67203ae1ba02.slice/crio-b8283927e967447ba4ad8521e5dd65fe00d031e5039dee36b7199bf437437741 WatchSource:0}: Error finding container b8283927e967447ba4ad8521e5dd65fe00d031e5039dee36b7199bf437437741: Status 404 returned error can't find the container with id b8283927e967447ba4ad8521e5dd65fe00d031e5039dee36b7199bf437437741 Mar 20 17:34:24 crc kubenswrapper[4795]: I0320 17:34:24.280678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" event={"ID":"084071f5-e58b-451b-9cf5-67203ae1ba02","Type":"ContainerStarted","Data":"b8283927e967447ba4ad8521e5dd65fe00d031e5039dee36b7199bf437437741"} Mar 20 17:34:29 crc kubenswrapper[4795]: I0320 17:34:29.325005 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" event={"ID":"084071f5-e58b-451b-9cf5-67203ae1ba02","Type":"ContainerStarted","Data":"8640157f9b5a55f800571b9991838d07d033a444ab6bc55fe715cab41c344de7"} Mar 20 17:34:29 crc kubenswrapper[4795]: I0320 17:34:29.325496 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" Mar 20 17:34:29 crc kubenswrapper[4795]: I0320 17:34:29.368055 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" podStartSLOduration=2.050125497 
podStartE2EDuration="6.368036782s" podCreationTimestamp="2026-03-20 17:34:23 +0000 UTC" firstStartedPulling="2026-03-20 17:34:23.917832671 +0000 UTC m=+1007.375864202" lastFinishedPulling="2026-03-20 17:34:28.235743946 +0000 UTC m=+1011.693775487" observedRunningTime="2026-03-20 17:34:29.365363778 +0000 UTC m=+1012.823395319" watchObservedRunningTime="2026-03-20 17:34:29.368036782 +0000 UTC m=+1012.826068333" Mar 20 17:34:33 crc kubenswrapper[4795]: I0320 17:34:33.451243 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" Mar 20 17:34:41 crc kubenswrapper[4795]: I0320 17:34:41.300030 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:34:41 crc kubenswrapper[4795]: I0320 17:34:41.300753 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:34:49 crc kubenswrapper[4795]: I0320 17:34:49.534085 4795 scope.go:117] "RemoveContainer" containerID="326b8c75bc495d3f796856aa4f0f247f31974ad88ddb26ad9ca2ca9ec8cf372a" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.508581 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.510177 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.524835 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lljpv" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.547341 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.549009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g56lz\" (UniqueName: \"kubernetes.io/projected/afefdb79-bad6-4deb-904b-515174cca414-kube-api-access-g56lz\") pod \"barbican-operator-controller-manager-59bc569d95-5hzvs\" (UID: \"afefdb79-bad6-4deb-904b-515174cca414\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.558271 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.560217 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.568186 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4bq59" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.571633 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.581899 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.584248 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-q58d8" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.599098 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.605413 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.618741 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.620898 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.622567 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-84jqr" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.627899 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.629494 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.632396 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-kqswl" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.633718 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.650371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwmwj\" (UniqueName: \"kubernetes.io/projected/21481bba-04ec-47ce-95d0-fe27787a3d62-kube-api-access-zwmwj\") pod \"cinder-operator-controller-manager-8d58dc466-h9f9t\" (UID: \"21481bba-04ec-47ce-95d0-fe27787a3d62\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.650431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4dhn\" (UniqueName: \"kubernetes.io/projected/43804d6b-2358-46fd-bf04-26b2308f8ab0-kube-api-access-r4dhn\") pod \"designate-operator-controller-manager-588d4d986b-jgs27\" (UID: \"43804d6b-2358-46fd-bf04-26b2308f8ab0\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.650478 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t59xc\" (UniqueName: \"kubernetes.io/projected/a957ef3d-357c-4aa4-865c-533f889257d7-kube-api-access-t59xc\") pod \"glance-operator-controller-manager-79df6bcc97-dwx6n\" (UID: \"a957ef3d-357c-4aa4-865c-533f889257d7\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 
17:34:51.650515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g56lz\" (UniqueName: \"kubernetes.io/projected/afefdb79-bad6-4deb-904b-515174cca414-kube-api-access-g56lz\") pod \"barbican-operator-controller-manager-59bc569d95-5hzvs\" (UID: \"afefdb79-bad6-4deb-904b-515174cca414\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.650544 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mlgc\" (UniqueName: \"kubernetes.io/projected/4cdd16c5-b7d3-4c52-a286-f3555daf43d9-kube-api-access-4mlgc\") pod \"heat-operator-controller-manager-67dd5f86f5-rmcrf\" (UID: \"4cdd16c5-b7d3-4c52-a286-f3555daf43d9\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.651204 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.655373 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.656856 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.663258 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-bfff2" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.667795 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.674313 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.675455 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.678230 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wc277" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.679148 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.679992 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.687801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g56lz\" (UniqueName: \"kubernetes.io/projected/afefdb79-bad6-4deb-904b-515174cca414-kube-api-access-g56lz\") pod \"barbican-operator-controller-manager-59bc569d95-5hzvs\" (UID: \"afefdb79-bad6-4deb-904b-515174cca414\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" Mar 20 17:34:51 crc 
kubenswrapper[4795]: I0320 17:34:51.693067 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.695675 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.698629 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.703063 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-6b7c8" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.718775 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.719557 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.724663 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qf8fp" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.727753 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-trjt4"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.729029 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.738220 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.738364 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rzr9l" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751135 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmklc\" (UniqueName: \"kubernetes.io/projected/84901a7b-ddbf-47d9-954f-c167cd9cd46c-kube-api-access-fmklc\") pod \"keystone-operator-controller-manager-768b96df4c-6hsxn\" (UID: \"84901a7b-ddbf-47d9-954f-c167cd9cd46c\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751181 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvmkp\" (UniqueName: \"kubernetes.io/projected/9cba9cd3-4144-4262-82a2-f2330793aae6-kube-api-access-nvmkp\") pod \"ironic-operator-controller-manager-6f787dddc9-55vp5\" (UID: \"9cba9cd3-4144-4262-82a2-f2330793aae6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751202 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751407 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4mlgc\" (UniqueName: \"kubernetes.io/projected/4cdd16c5-b7d3-4c52-a286-f3555daf43d9-kube-api-access-4mlgc\") pod \"heat-operator-controller-manager-67dd5f86f5-rmcrf\" (UID: \"4cdd16c5-b7d3-4c52-a286-f3555daf43d9\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751536 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwmwj\" (UniqueName: \"kubernetes.io/projected/21481bba-04ec-47ce-95d0-fe27787a3d62-kube-api-access-zwmwj\") pod \"cinder-operator-controller-manager-8d58dc466-h9f9t\" (UID: \"21481bba-04ec-47ce-95d0-fe27787a3d62\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpfzh\" (UniqueName: \"kubernetes.io/projected/7a887d91-fa86-45d2-a6be-aa7326f7d544-kube-api-access-mpfzh\") pod \"manila-operator-controller-manager-55f864c847-trjt4\" (UID: \"7a887d91-fa86-45d2-a6be-aa7326f7d544\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751705 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4dhn\" (UniqueName: \"kubernetes.io/projected/43804d6b-2358-46fd-bf04-26b2308f8ab0-kube-api-access-r4dhn\") pod \"designate-operator-controller-manager-588d4d986b-jgs27\" (UID: \"43804d6b-2358-46fd-bf04-26b2308f8ab0\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vfms\" (UniqueName: 
\"kubernetes.io/projected/ded84ba8-d70a-4379-bc80-d142e5306cc7-kube-api-access-8vfms\") pod \"horizon-operator-controller-manager-8464cc45fb-f74p9\" (UID: \"ded84ba8-d70a-4379-bc80-d142e5306cc7\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdccc\" (UniqueName: \"kubernetes.io/projected/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-kube-api-access-fdccc\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t59xc\" (UniqueName: \"kubernetes.io/projected/a957ef3d-357c-4aa4-865c-533f889257d7-kube-api-access-t59xc\") pod \"glance-operator-controller-manager-79df6bcc97-dwx6n\" (UID: \"a957ef3d-357c-4aa4-865c-533f889257d7\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.758442 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-trjt4"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.763808 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.764770 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.770066 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-z48z9" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.775667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwmwj\" (UniqueName: \"kubernetes.io/projected/21481bba-04ec-47ce-95d0-fe27787a3d62-kube-api-access-zwmwj\") pod \"cinder-operator-controller-manager-8d58dc466-h9f9t\" (UID: \"21481bba-04ec-47ce-95d0-fe27787a3d62\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.784262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4dhn\" (UniqueName: \"kubernetes.io/projected/43804d6b-2358-46fd-bf04-26b2308f8ab0-kube-api-access-r4dhn\") pod \"designate-operator-controller-manager-588d4d986b-jgs27\" (UID: \"43804d6b-2358-46fd-bf04-26b2308f8ab0\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.787902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mlgc\" (UniqueName: \"kubernetes.io/projected/4cdd16c5-b7d3-4c52-a286-f3555daf43d9-kube-api-access-4mlgc\") pod \"heat-operator-controller-manager-67dd5f86f5-rmcrf\" (UID: \"4cdd16c5-b7d3-4c52-a286-f3555daf43d9\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.798393 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.800462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t59xc\" (UniqueName: \"kubernetes.io/projected/a957ef3d-357c-4aa4-865c-533f889257d7-kube-api-access-t59xc\") pod \"glance-operator-controller-manager-79df6bcc97-dwx6n\" (UID: \"a957ef3d-357c-4aa4-865c-533f889257d7\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.836165 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.837017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.842099 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7zpb9" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.853738 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpfzh\" (UniqueName: \"kubernetes.io/projected/7a887d91-fa86-45d2-a6be-aa7326f7d544-kube-api-access-mpfzh\") pod \"manila-operator-controller-manager-55f864c847-trjt4\" (UID: \"7a887d91-fa86-45d2-a6be-aa7326f7d544\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854463 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gddpx\" (UniqueName: \"kubernetes.io/projected/071f0af8-4164-4f95-b0ee-720e3b3097f3-kube-api-access-gddpx\") pod \"mariadb-operator-controller-manager-67ccfc9778-jfdzb\" (UID: \"071f0af8-4164-4f95-b0ee-720e3b3097f3\") " 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854498 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vfms\" (UniqueName: \"kubernetes.io/projected/ded84ba8-d70a-4379-bc80-d142e5306cc7-kube-api-access-8vfms\") pod \"horizon-operator-controller-manager-8464cc45fb-f74p9\" (UID: \"ded84ba8-d70a-4379-bc80-d142e5306cc7\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdccc\" (UniqueName: \"kubernetes.io/projected/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-kube-api-access-fdccc\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854529 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmklc\" (UniqueName: \"kubernetes.io/projected/84901a7b-ddbf-47d9-954f-c167cd9cd46c-kube-api-access-fmklc\") pod \"keystone-operator-controller-manager-768b96df4c-6hsxn\" (UID: \"84901a7b-ddbf-47d9-954f-c167cd9cd46c\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854568 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmkp\" (UniqueName: \"kubernetes.io/projected/9cba9cd3-4144-4262-82a2-f2330793aae6-kube-api-access-nvmkp\") pod \"ironic-operator-controller-manager-6f787dddc9-55vp5\" (UID: \"9cba9cd3-4144-4262-82a2-f2330793aae6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:51 crc kubenswrapper[4795]: E0320 17:34:51.854725 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:51 crc kubenswrapper[4795]: E0320 17:34:51.854770 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:34:52.354755457 +0000 UTC m=+1035.812786998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.865493 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.875294 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.879212 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.880469 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.881789 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.887257 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-glwpx" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.887731 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-d8klm" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.892779 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.896806 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vfms\" (UniqueName: \"kubernetes.io/projected/ded84ba8-d70a-4379-bc80-d142e5306cc7-kube-api-access-8vfms\") pod \"horizon-operator-controller-manager-8464cc45fb-f74p9\" (UID: \"ded84ba8-d70a-4379-bc80-d142e5306cc7\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.897081 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.898140 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvmkp\" (UniqueName: \"kubernetes.io/projected/9cba9cd3-4144-4262-82a2-f2330793aae6-kube-api-access-nvmkp\") pod \"ironic-operator-controller-manager-6f787dddc9-55vp5\" (UID: \"9cba9cd3-4144-4262-82a2-f2330793aae6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.904655 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.905448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdccc\" (UniqueName: \"kubernetes.io/projected/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-kube-api-access-fdccc\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.908141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpfzh\" (UniqueName: \"kubernetes.io/projected/7a887d91-fa86-45d2-a6be-aa7326f7d544-kube-api-access-mpfzh\") pod \"manila-operator-controller-manager-55f864c847-trjt4\" (UID: \"7a887d91-fa86-45d2-a6be-aa7326f7d544\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.912667 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.913255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fmklc\" (UniqueName: \"kubernetes.io/projected/84901a7b-ddbf-47d9-954f-c167cd9cd46c-kube-api-access-fmklc\") pod \"keystone-operator-controller-manager-768b96df4c-6hsxn\" (UID: \"84901a7b-ddbf-47d9-954f-c167cd9cd46c\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.913906 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.924413 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tshl6" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.934572 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.935427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.938472 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-j696n" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.938669 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.989178 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.990503 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.992316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.992897 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996247 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk6wb\" (UniqueName: \"kubernetes.io/projected/d4ff6977-1303-4267-983e-3e99935f2aae-kube-api-access-lk6wb\") pod \"octavia-operator-controller-manager-5b9f45d989-n7cl7\" (UID: \"d4ff6977-1303-4267-983e-3e99935f2aae\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pdhx\" (UniqueName: \"kubernetes.io/projected/0ffe016b-8919-4b8f-839c-669637b7accc-kube-api-access-9pdhx\") pod \"neutron-operator-controller-manager-767865f676-bqzcz\" (UID: \"0ffe016b-8919-4b8f-839c-669637b7accc\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gddpx\" (UniqueName: \"kubernetes.io/projected/071f0af8-4164-4f95-b0ee-720e3b3097f3-kube-api-access-gddpx\") pod 
\"mariadb-operator-controller-manager-67ccfc9778-jfdzb\" (UID: \"071f0af8-4164-4f95-b0ee-720e3b3097f3\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spfbz\" (UniqueName: \"kubernetes.io/projected/84a19583-b173-4fb9-8b83-d9c41a5faf79-kube-api-access-spfbz\") pod \"ovn-operator-controller-manager-884679f54-dtfmz\" (UID: \"84a19583-b173-4fb9-8b83-d9c41a5faf79\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qttmq\" (UniqueName: \"kubernetes.io/projected/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-kube-api-access-qttmq\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct28n\" (UniqueName: \"kubernetes.io/projected/0da03e08-561c-4b5f-89c7-af80c8f39f54-kube-api-access-ct28n\") pod \"nova-operator-controller-manager-5d488d59fb-5v5sg\" (UID: 
\"0da03e08-561c-4b5f-89c7-af80c8f39f54\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.002710 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.005796 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.009540 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-cn87b" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.025475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gddpx\" (UniqueName: \"kubernetes.io/projected/071f0af8-4164-4f95-b0ee-720e3b3097f3-kube-api-access-gddpx\") pod \"mariadb-operator-controller-manager-67ccfc9778-jfdzb\" (UID: \"071f0af8-4164-4f95-b0ee-720e3b3097f3\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.045749 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.051387 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.053891 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.073997 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.078841 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.086331 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-828jr"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.087394 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.099797 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.099858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spfbz\" (UniqueName: \"kubernetes.io/projected/84a19583-b173-4fb9-8b83-d9c41a5faf79-kube-api-access-spfbz\") pod \"ovn-operator-controller-manager-884679f54-dtfmz\" (UID: \"84a19583-b173-4fb9-8b83-d9c41a5faf79\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.099916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qttmq\" (UniqueName: \"kubernetes.io/projected/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-kube-api-access-qttmq\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: 
\"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.099942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct28n\" (UniqueName: \"kubernetes.io/projected/0da03e08-561c-4b5f-89c7-af80c8f39f54-kube-api-access-ct28n\") pod \"nova-operator-controller-manager-5d488d59fb-5v5sg\" (UID: \"0da03e08-561c-4b5f-89c7-af80c8f39f54\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.099978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk6wb\" (UniqueName: \"kubernetes.io/projected/d4ff6977-1303-4267-983e-3e99935f2aae-kube-api-access-lk6wb\") pod \"octavia-operator-controller-manager-5b9f45d989-n7cl7\" (UID: \"d4ff6977-1303-4267-983e-3e99935f2aae\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.100017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pdhx\" (UniqueName: \"kubernetes.io/projected/0ffe016b-8919-4b8f-839c-669637b7accc-kube-api-access-9pdhx\") pod \"neutron-operator-controller-manager-767865f676-bqzcz\" (UID: \"0ffe016b-8919-4b8f-839c-669637b7accc\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.100052 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mckr\" (UniqueName: \"kubernetes.io/projected/b47e6216-2e29-4d58-8b0c-5970aee6307b-kube-api-access-8mckr\") pod \"placement-operator-controller-manager-5784578c99-6cw7v\" (UID: \"b47e6216-2e29-4d58-8b0c-5970aee6307b\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 
20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.100092 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.100173 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:52.600144395 +0000 UTC m=+1036.058175936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.101509 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fphs6" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.113961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-828jr"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.132916 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct28n\" (UniqueName: \"kubernetes.io/projected/0da03e08-561c-4b5f-89c7-af80c8f39f54-kube-api-access-ct28n\") pod \"nova-operator-controller-manager-5d488d59fb-5v5sg\" (UID: \"0da03e08-561c-4b5f-89c7-af80c8f39f54\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.133327 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spfbz\" (UniqueName: 
\"kubernetes.io/projected/84a19583-b173-4fb9-8b83-d9c41a5faf79-kube-api-access-spfbz\") pod \"ovn-operator-controller-manager-884679f54-dtfmz\" (UID: \"84a19583-b173-4fb9-8b83-d9c41a5faf79\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.135785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pdhx\" (UniqueName: \"kubernetes.io/projected/0ffe016b-8919-4b8f-839c-669637b7accc-kube-api-access-9pdhx\") pod \"neutron-operator-controller-manager-767865f676-bqzcz\" (UID: \"0ffe016b-8919-4b8f-839c-669637b7accc\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.136350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk6wb\" (UniqueName: \"kubernetes.io/projected/d4ff6977-1303-4267-983e-3e99935f2aae-kube-api-access-lk6wb\") pod \"octavia-operator-controller-manager-5b9f45d989-n7cl7\" (UID: \"d4ff6977-1303-4267-983e-3e99935f2aae\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.140943 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.141554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qttmq\" (UniqueName: \"kubernetes.io/projected/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-kube-api-access-qttmq\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.154898 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.156019 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.160252 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7779x" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.177244 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.178348 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.187438 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.188969 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.191489 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vvgjj" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.203828 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.208322 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rbjs\" (UniqueName: \"kubernetes.io/projected/750d9405-0514-4876-821e-9ab1f6871e87-kube-api-access-4rbjs\") pod \"swift-operator-controller-manager-c674c5965-828jr\" (UID: \"750d9405-0514-4876-821e-9ab1f6871e87\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.208455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mckr\" (UniqueName: \"kubernetes.io/projected/b47e6216-2e29-4d58-8b0c-5970aee6307b-kube-api-access-8mckr\") pod \"placement-operator-controller-manager-5784578c99-6cw7v\" (UID: \"b47e6216-2e29-4d58-8b0c-5970aee6307b\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.210415 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.218051 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.218913 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.226295 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.227270 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.230625 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lvr2r" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.237333 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mckr\" (UniqueName: \"kubernetes.io/projected/b47e6216-2e29-4d58-8b0c-5970aee6307b-kube-api-access-8mckr\") pod \"placement-operator-controller-manager-5784578c99-6cw7v\" (UID: \"b47e6216-2e29-4d58-8b0c-5970aee6307b\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.284098 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.284947 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.285942 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.287966 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.288079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-h5tth" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.288509 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.290478 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.309401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4549\" (UniqueName: \"kubernetes.io/projected/46248665-6f9f-46e0-8db7-6be8c47cf521-kube-api-access-q4549\") pod \"telemetry-operator-controller-manager-d6b694c5-jbwss\" (UID: \"46248665-6f9f-46e0-8db7-6be8c47cf521\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.309448 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp47s\" (UniqueName: \"kubernetes.io/projected/933bcfd5-f2d1-404f-876d-1d3da597f415-kube-api-access-qp47s\") pod \"watcher-operator-controller-manager-6c4d75f7f9-6z7j5\" (UID: \"933bcfd5-f2d1-404f-876d-1d3da597f415\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.309529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tzqzw\" (UniqueName: \"kubernetes.io/projected/e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38-kube-api-access-tzqzw\") pod \"test-operator-controller-manager-5c5cb9c4d7-rv5df\" (UID: \"e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.309582 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbjs\" (UniqueName: \"kubernetes.io/projected/750d9405-0514-4876-821e-9ab1f6871e87-kube-api-access-4rbjs\") pod \"swift-operator-controller-manager-c674c5965-828jr\" (UID: \"750d9405-0514-4876-821e-9ab1f6871e87\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.327181 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbjs\" (UniqueName: \"kubernetes.io/projected/750d9405-0514-4876-821e-9ab1f6871e87-kube-api-access-4rbjs\") pod \"swift-operator-controller-manager-c674c5965-828jr\" (UID: \"750d9405-0514-4876-821e-9ab1f6871e87\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.350550 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4549\" (UniqueName: \"kubernetes.io/projected/46248665-6f9f-46e0-8db7-6be8c47cf521-kube-api-access-q4549\") pod \"telemetry-operator-controller-manager-d6b694c5-jbwss\" (UID: \"46248665-6f9f-46e0-8db7-6be8c47cf521\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410627 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp47s\" (UniqueName: \"kubernetes.io/projected/933bcfd5-f2d1-404f-876d-1d3da597f415-kube-api-access-qp47s\") pod \"watcher-operator-controller-manager-6c4d75f7f9-6z7j5\" (UID: \"933bcfd5-f2d1-404f-876d-1d3da597f415\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410682 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzqzw\" (UniqueName: \"kubernetes.io/projected/e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38-kube-api-access-tzqzw\") pod \"test-operator-controller-manager-5c5cb9c4d7-rv5df\" (UID: \"e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410758 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grvzm\" (UniqueName: \"kubernetes.io/projected/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-kube-api-access-grvzm\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.411344 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.411391 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:53.411376244 +0000 UTC m=+1036.869407785 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.437418 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.442100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzqzw\" (UniqueName: \"kubernetes.io/projected/e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38-kube-api-access-tzqzw\") pod \"test-operator-controller-manager-5c5cb9c4d7-rv5df\" (UID: \"e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.452622 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp47s\" (UniqueName: \"kubernetes.io/projected/933bcfd5-f2d1-404f-876d-1d3da597f415-kube-api-access-qp47s\") pod \"watcher-operator-controller-manager-6c4d75f7f9-6z7j5\" (UID: \"933bcfd5-f2d1-404f-876d-1d3da597f415\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.452635 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4549\" (UniqueName: \"kubernetes.io/projected/46248665-6f9f-46e0-8db7-6be8c47cf521-kube-api-access-q4549\") pod \"telemetry-operator-controller-manager-d6b694c5-jbwss\" (UID: \"46248665-6f9f-46e0-8db7-6be8c47cf521\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.466399 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.504150 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.513386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grvzm\" (UniqueName: \"kubernetes.io/projected/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-kube-api-access-grvzm\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.513458 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.513518 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.513708 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.513764 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:53.013745816 +0000 UTC m=+1036.471777357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.514167 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.514197 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:53.01418772 +0000 UTC m=+1036.472219261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.534880 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.539547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.544641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grvzm\" (UniqueName: \"kubernetes.io/projected/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-kube-api-access-grvzm\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.557011 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.614425 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.615042 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.615122 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:34:53.615103567 +0000 UTC m=+1037.073135108 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.666909 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.674272 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.682431 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.847976 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf"] Mar 20 17:34:52 crc kubenswrapper[4795]: W0320 17:34:52.857053 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cdd16c5_b7d3_4c52_a286_f3555daf43d9.slice/crio-2fcb6e288ea33996299490589e0e5ef05a767952fd2ed908f11f572717d8004b WatchSource:0}: Error finding container 2fcb6e288ea33996299490589e0e5ef05a767952fd2ed908f11f572717d8004b: Status 404 returned error can't find the container with id 2fcb6e288ea33996299490589e0e5ef05a767952fd2ed908f11f572717d8004b Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.888228 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 
17:34:52.899546 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.905988 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz"] Mar 20 17:34:52 crc kubenswrapper[4795]: W0320 17:34:52.914360 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ffe016b_8919_4b8f_839c_669637b7accc.slice/crio-36d91a794c1f24f381f6acd78c49047ff9b43348de16ced056638998ec2f4d82 WatchSource:0}: Error finding container 36d91a794c1f24f381f6acd78c49047ff9b43348de16ced056638998ec2f4d82: Status 404 returned error can't find the container with id 36d91a794c1f24f381f6acd78c49047ff9b43348de16ced056638998ec2f4d82 Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.034482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.034564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.034699 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:34:53 crc 
kubenswrapper[4795]: E0320 17:34:53.034783 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.034786 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:54.034767569 +0000 UTC m=+1037.492799100 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.034857 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:54.034836842 +0000 UTC m=+1037.492868493 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.059360 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7"] Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.068368 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb"] Mar 20 17:34:53 crc kubenswrapper[4795]: W0320 17:34:53.075727 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4ff6977_1303_4267_983e_3e99935f2aae.slice/crio-34c30595d15558a4f68e2dcd0777715ef56171787e2d05b001251128e2fa1272 WatchSource:0}: Error finding container 34c30595d15558a4f68e2dcd0777715ef56171787e2d05b001251128e2fa1272: Status 404 returned error can't find the container with id 34c30595d15558a4f68e2dcd0777715ef56171787e2d05b001251128e2fa1272 Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.086762 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg"] Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.092821 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v"] Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.100115 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz"] Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.103261 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gddpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-jfdzb_openstack-operators(071f0af8-4164-4f95-b0ee-720e3b3097f3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.103344 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mpfzh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-trjt4_openstack-operators(7a887d91-fa86-45d2-a6be-aa7326f7d544): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:34:53 crc kubenswrapper[4795]: W0320 17:34:53.104830 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a19583_b173_4fb9_8b83_d9c41a5faf79.slice/crio-86c005f15d069a320e8080c8d86af12e7ae2278f72d07fdfbde275f97c00079c WatchSource:0}: Error finding container 
86c005f15d069a320e8080c8d86af12e7ae2278f72d07fdfbde275f97c00079c: Status 404 returned error can't find the container with id 86c005f15d069a320e8080c8d86af12e7ae2278f72d07fdfbde275f97c00079c Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.104873 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" podUID="7a887d91-fa86-45d2-a6be-aa7326f7d544" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.104907 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" podUID="071f0af8-4164-4f95-b0ee-720e3b3097f3" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.106620 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-trjt4"] Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.109059 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-spfbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-dtfmz_openstack-operators(84a19583-b173-4fb9-8b83-d9c41a5faf79): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.110191 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" podUID="84a19583-b173-4fb9-8b83-d9c41a5faf79" Mar 20 17:34:53 crc 
kubenswrapper[4795]: I0320 17:34:53.266152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss"] Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.273713 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-828jr"] Mar 20 17:34:53 crc kubenswrapper[4795]: W0320 17:34:53.276651 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46248665_6f9f_46e0_8db7_6be8c47cf521.slice/crio-5ba90a5941ecbc6901bc5bd3c7ee3fb88cb6af6a43c0147d606d4c3681463a9a WatchSource:0}: Error finding container 5ba90a5941ecbc6901bc5bd3c7ee3fb88cb6af6a43c0147d606d4c3681463a9a: Status 404 returned error can't find the container with id 5ba90a5941ecbc6901bc5bd3c7ee3fb88cb6af6a43c0147d606d4c3681463a9a Mar 20 17:34:53 crc kubenswrapper[4795]: W0320 17:34:53.277314 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod933bcfd5_f2d1_404f_876d_1d3da597f415.slice/crio-8420b7743edfbf8bffb9b669d570a888ca4c6676d696d42109d26c020d0db1bc WatchSource:0}: Error finding container 8420b7743edfbf8bffb9b669d570a888ca4c6676d696d42109d26c020d0db1bc: Status 404 returned error can't find the container with id 8420b7743edfbf8bffb9b669d570a888ca4c6676d696d42109d26c020d0db1bc Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.278402 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q4549,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-jbwss_openstack-operators(46248665-6f9f-46e0-8db7-6be8c47cf521): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.279603 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5"] Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.279754 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" podUID="46248665-6f9f-46e0-8db7-6be8c47cf521" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.279735 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qp47s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-6z7j5_openstack-operators(933bcfd5-f2d1-404f-876d-1d3da597f415): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.281055 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" podUID="933bcfd5-f2d1-404f-876d-1d3da597f415" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.354121 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df"] Mar 20 17:34:53 crc kubenswrapper[4795]: W0320 17:34:53.367402 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13cbad4_3a2f_4b3c_82d8_c3984c5a9f38.slice/crio-4546fc64daf773de573638d594c554e1efe0ce9a7c9ee345711100f2b0aaaed6 WatchSource:0}: Error finding container 4546fc64daf773de573638d594c554e1efe0ce9a7c9ee345711100f2b0aaaed6: Status 404 returned error can't find the container with id 
4546fc64daf773de573638d594c554e1efe0ce9a7c9ee345711100f2b0aaaed6 Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.443359 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.443553 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.444054 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:55.444034424 +0000 UTC m=+1038.902065965 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.484049 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" event={"ID":"071f0af8-4164-4f95-b0ee-720e3b3097f3","Type":"ContainerStarted","Data":"4cabaf11a79fb91abe478b428ede08dbdc503226c0356910eb3929edc7880bbc"} Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.485365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" podUID="071f0af8-4164-4f95-b0ee-720e3b3097f3" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.486937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" event={"ID":"933bcfd5-f2d1-404f-876d-1d3da597f415","Type":"ContainerStarted","Data":"8420b7743edfbf8bffb9b669d570a888ca4c6676d696d42109d26c020d0db1bc"} Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.489406 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" podUID="933bcfd5-f2d1-404f-876d-1d3da597f415" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.489863 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" event={"ID":"750d9405-0514-4876-821e-9ab1f6871e87","Type":"ContainerStarted","Data":"85cc59475d6cbb9466309ccdd9b0cf283924bdd8183fe824ba562e2dca74278d"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.491613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" event={"ID":"0ffe016b-8919-4b8f-839c-669637b7accc","Type":"ContainerStarted","Data":"36d91a794c1f24f381f6acd78c49047ff9b43348de16ced056638998ec2f4d82"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.496497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" event={"ID":"9cba9cd3-4144-4262-82a2-f2330793aae6","Type":"ContainerStarted","Data":"f5b05db5be497c5a21296390e1a3cbff935b25a2884fcf202309945caad66f20"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.503595 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" event={"ID":"7a887d91-fa86-45d2-a6be-aa7326f7d544","Type":"ContainerStarted","Data":"cd30e6659db0586c6042a15cd77a7a06d5704aab30efe088dada386528984038"} Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.504940 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" podUID="7a887d91-fa86-45d2-a6be-aa7326f7d544" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.505227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" 
event={"ID":"afefdb79-bad6-4deb-904b-515174cca414","Type":"ContainerStarted","Data":"2f3a456e174ab8c426e180d784e33c498cc2bde40a390a29114ce9a43c92dc8f"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.509146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" event={"ID":"b47e6216-2e29-4d58-8b0c-5970aee6307b","Type":"ContainerStarted","Data":"737e1ba361aadea472d34b641c5f02bf98bbf03ae64cee976c494a4d8a141b4f"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.510270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" event={"ID":"0da03e08-561c-4b5f-89c7-af80c8f39f54","Type":"ContainerStarted","Data":"673a499c42bd510dfbf994dd82475e1399cdc7a7713ed5e4d6aaf6b083417661"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.513278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" event={"ID":"d4ff6977-1303-4267-983e-3e99935f2aae","Type":"ContainerStarted","Data":"34c30595d15558a4f68e2dcd0777715ef56171787e2d05b001251128e2fa1272"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.514919 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" event={"ID":"a957ef3d-357c-4aa4-865c-533f889257d7","Type":"ContainerStarted","Data":"5cb13b1a6325a4b0a0726c57569fa8a23c806ea420c213c17966bbbbc30c65ae"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.517676 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" event={"ID":"ded84ba8-d70a-4379-bc80-d142e5306cc7","Type":"ContainerStarted","Data":"5d134f13c7cca2bcc01cdf1ec614bc32120abb5f48e12b0f8c7864d98d632ba3"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.519581 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" event={"ID":"43804d6b-2358-46fd-bf04-26b2308f8ab0","Type":"ContainerStarted","Data":"59bacb9f3e843e29b92fd9a0dc2487c591a4c65c27efe147724842748fa8562d"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.530971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" event={"ID":"21481bba-04ec-47ce-95d0-fe27787a3d62","Type":"ContainerStarted","Data":"22226aa722815ae1f8e9787887d926498e3b5e994cde1ce60fe7afd0d17b2fbc"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.533351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" event={"ID":"4cdd16c5-b7d3-4c52-a286-f3555daf43d9","Type":"ContainerStarted","Data":"2fcb6e288ea33996299490589e0e5ef05a767952fd2ed908f11f572717d8004b"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.535169 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" event={"ID":"84901a7b-ddbf-47d9-954f-c167cd9cd46c","Type":"ContainerStarted","Data":"dd3a72b63d60bc17f484898dc47dca0889e536ec15f9cf9c214941e06449316c"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.536345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" event={"ID":"46248665-6f9f-46e0-8db7-6be8c47cf521","Type":"ContainerStarted","Data":"5ba90a5941ecbc6901bc5bd3c7ee3fb88cb6af6a43c0147d606d4c3681463a9a"} Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.537760 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" podUID="46248665-6f9f-46e0-8db7-6be8c47cf521" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.537943 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" event={"ID":"84a19583-b173-4fb9-8b83-d9c41a5faf79","Type":"ContainerStarted","Data":"86c005f15d069a320e8080c8d86af12e7ae2278f72d07fdfbde275f97c00079c"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.538996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" event={"ID":"e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38","Type":"ContainerStarted","Data":"4546fc64daf773de573638d594c554e1efe0ce9a7c9ee345711100f2b0aaaed6"} Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.539207 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" podUID="84a19583-b173-4fb9-8b83-d9c41a5faf79" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.646730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.646891 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:53 crc 
kubenswrapper[4795]: E0320 17:34:53.646975 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:55.646954252 +0000 UTC m=+1039.104985853 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:54 crc kubenswrapper[4795]: I0320 17:34:54.050567 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:54 crc kubenswrapper[4795]: I0320 17:34:54.050676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.050745 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.050813 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:34:56.050796564 +0000 UTC m=+1039.508828105 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.050824 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.050896 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:56.050878157 +0000 UTC m=+1039.508909698 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.552060 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" podUID="46248665-6f9f-46e0-8db7-6be8c47cf521" Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.552456 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" podUID="84a19583-b173-4fb9-8b83-d9c41a5faf79" Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.552929 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" podUID="7a887d91-fa86-45d2-a6be-aa7326f7d544" Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.553755 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" podUID="071f0af8-4164-4f95-b0ee-720e3b3097f3" Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.560207 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" podUID="933bcfd5-f2d1-404f-876d-1d3da597f415" Mar 20 17:34:55 crc kubenswrapper[4795]: I0320 17:34:55.470855 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: 
\"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:55 crc kubenswrapper[4795]: E0320 17:34:55.471118 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:55 crc kubenswrapper[4795]: E0320 17:34:55.471457 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:59.471426497 +0000 UTC m=+1042.929458088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:55 crc kubenswrapper[4795]: I0320 17:34:55.674603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:55 crc kubenswrapper[4795]: E0320 17:34:55.674834 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:55 crc kubenswrapper[4795]: E0320 17:34:55.674945 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:34:59.674916192 +0000 UTC m=+1043.132947763 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:56 crc kubenswrapper[4795]: I0320 17:34:56.255212 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:56 crc kubenswrapper[4795]: I0320 17:34:56.255444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:56 crc kubenswrapper[4795]: E0320 17:34:56.255590 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:34:56 crc kubenswrapper[4795]: E0320 17:34:56.256084 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:00.256057875 +0000 UTC m=+1043.714089416 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:34:56 crc kubenswrapper[4795]: E0320 17:34:56.255667 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:34:56 crc kubenswrapper[4795]: E0320 17:34:56.256550 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:00.25653864 +0000 UTC m=+1043.714570181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:34:59 crc kubenswrapper[4795]: I0320 17:34:59.517826 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:59 crc kubenswrapper[4795]: E0320 17:34:59.518048 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:59 crc kubenswrapper[4795]: E0320 17:34:59.518231 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert 
podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:07.51821324 +0000 UTC m=+1050.976244791 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:59 crc kubenswrapper[4795]: I0320 17:34:59.720238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:59 crc kubenswrapper[4795]: E0320 17:34:59.720470 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:59 crc kubenswrapper[4795]: E0320 17:34:59.720582 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:07.720559899 +0000 UTC m=+1051.178591460 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:35:00 crc kubenswrapper[4795]: I0320 17:35:00.329070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:00 crc kubenswrapper[4795]: E0320 17:35:00.329141 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:35:00 crc kubenswrapper[4795]: E0320 17:35:00.329447 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:08.329395846 +0000 UTC m=+1051.787427427 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:35:00 crc kubenswrapper[4795]: I0320 17:35:00.329780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:00 crc kubenswrapper[4795]: E0320 17:35:00.329871 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:35:00 crc kubenswrapper[4795]: E0320 17:35:00.329914 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:08.329901892 +0000 UTC m=+1051.787933433 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:35:05 crc kubenswrapper[4795]: E0320 17:35:05.357719 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 17:35:05 crc kubenswrapper[4795]: E0320 17:35:05.358572 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fmklc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-6hsxn_openstack-operators(84901a7b-ddbf-47d9-954f-c167cd9cd46c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:35:05 crc kubenswrapper[4795]: E0320 17:35:05.359732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" podUID="84901a7b-ddbf-47d9-954f-c167cd9cd46c" Mar 20 17:35:05 crc kubenswrapper[4795]: E0320 17:35:05.658555 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" podUID="84901a7b-ddbf-47d9-954f-c167cd9cd46c" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.646360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" event={"ID":"ded84ba8-d70a-4379-bc80-d142e5306cc7","Type":"ContainerStarted","Data":"f0111d95524df63d092078718c2b8a07e14489ced1dc928f4a37eb94743523f9"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.646786 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.648331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" event={"ID":"750d9405-0514-4876-821e-9ab1f6871e87","Type":"ContainerStarted","Data":"2a4ceef1025241bf7813f32b31cfbacb69561fa89646d59611274bee301e0401"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.648458 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.650883 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" event={"ID":"a957ef3d-357c-4aa4-865c-533f889257d7","Type":"ContainerStarted","Data":"30abe1dd0f5e01ba987f9078ad4b3e27b3791baf5cde76eb547bde20295fbe43"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.651031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.653363 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" event={"ID":"e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38","Type":"ContainerStarted","Data":"fc7412ccaabdbde63c329639200c1eb2ed5b06f046a381f482af3feaac0a1e39"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.653438 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.654444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" event={"ID":"21481bba-04ec-47ce-95d0-fe27787a3d62","Type":"ContainerStarted","Data":"3ee9b8b9b039c452711c96b2f53386f9707dd961532dde1cc7c8927377e9c1d1"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.654505 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.658579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" event={"ID":"4cdd16c5-b7d3-4c52-a286-f3555daf43d9","Type":"ContainerStarted","Data":"77bf96924589b3fed6d330e5205405e198cf9640b6c8198a58ad91fa24fe6656"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.659098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.661487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" event={"ID":"b47e6216-2e29-4d58-8b0c-5970aee6307b","Type":"ContainerStarted","Data":"04685a046ace1ea9da0b1d9c67dfbec679438d93bb48314c6623e6fcfd082f5e"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 
17:35:06.661874 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.663248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" event={"ID":"0da03e08-561c-4b5f-89c7-af80c8f39f54","Type":"ContainerStarted","Data":"4ffd2799e63824e453db8e53879d5a9b721bc69e1b92017ad01c21c56ba64b6a"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.663769 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.666073 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" event={"ID":"9cba9cd3-4144-4262-82a2-f2330793aae6","Type":"ContainerStarted","Data":"87b55f790e0a50388374581f9ef8fa9590a309cf494bfbe7676033e0dc4d888e"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.666453 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.670042 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" event={"ID":"43804d6b-2358-46fd-bf04-26b2308f8ab0","Type":"ContainerStarted","Data":"f2389ea7871661d9bbdb0841ab68938a9508a3c849280c59cf93da116aa16e0e"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.670172 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.684892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" event={"ID":"afefdb79-bad6-4deb-904b-515174cca414","Type":"ContainerStarted","Data":"16661efb6b481c698cededa1a48a6989cae35f19d2763d45838111ab0f197459"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.684998 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.687268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" event={"ID":"d4ff6977-1303-4267-983e-3e99935f2aae","Type":"ContainerStarted","Data":"e4898a17581b68a6af81ca0110fc8504408b90f6239f76b0c7055b50073890a2"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.687363 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.688379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" event={"ID":"0ffe016b-8919-4b8f-839c-669637b7accc","Type":"ContainerStarted","Data":"0b049817003f40f212055703b53b1d9a99200e2dd68a291a3c37a3e57f322917"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.688511 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.705213 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" podStartSLOduration=3.020427446 podStartE2EDuration="15.705195096s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.699985958 +0000 UTC m=+1036.158017499" 
lastFinishedPulling="2026-03-20 17:35:05.384753608 +0000 UTC m=+1048.842785149" observedRunningTime="2026-03-20 17:35:06.681177458 +0000 UTC m=+1050.139208999" watchObservedRunningTime="2026-03-20 17:35:06.705195096 +0000 UTC m=+1050.163226637" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.706393 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" podStartSLOduration=3.059385927 podStartE2EDuration="15.706388634s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.680908406 +0000 UTC m=+1036.138939937" lastFinishedPulling="2026-03-20 17:35:05.327911103 +0000 UTC m=+1048.785942644" observedRunningTime="2026-03-20 17:35:06.701303763 +0000 UTC m=+1050.159335304" watchObservedRunningTime="2026-03-20 17:35:06.706388634 +0000 UTC m=+1050.164420175" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.726817 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" podStartSLOduration=3.445167239 podStartE2EDuration="15.726803759s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.101087314 +0000 UTC m=+1036.559118855" lastFinishedPulling="2026-03-20 17:35:05.382723814 +0000 UTC m=+1048.840755375" observedRunningTime="2026-03-20 17:35:06.72306164 +0000 UTC m=+1050.181093171" watchObservedRunningTime="2026-03-20 17:35:06.726803759 +0000 UTC m=+1050.184835300" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.748046 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" podStartSLOduration=3.316700342 podStartE2EDuration="15.748029159s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.901536513 +0000 UTC m=+1036.359568054" 
lastFinishedPulling="2026-03-20 17:35:05.33286532 +0000 UTC m=+1048.790896871" observedRunningTime="2026-03-20 17:35:06.745198299 +0000 UTC m=+1050.203229840" watchObservedRunningTime="2026-03-20 17:35:06.748029159 +0000 UTC m=+1050.206060700" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.810500 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" podStartSLOduration=3.298114225 podStartE2EDuration="15.810483051s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.858836864 +0000 UTC m=+1036.316868405" lastFinishedPulling="2026-03-20 17:35:05.37120568 +0000 UTC m=+1048.829237231" observedRunningTime="2026-03-20 17:35:06.809301403 +0000 UTC m=+1050.267332944" watchObservedRunningTime="2026-03-20 17:35:06.810483051 +0000 UTC m=+1050.268514592" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.810789 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" podStartSLOduration=3.803452324 podStartE2EDuration="15.810786001s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.370694358 +0000 UTC m=+1036.828725899" lastFinishedPulling="2026-03-20 17:35:05.378028035 +0000 UTC m=+1048.836059576" observedRunningTime="2026-03-20 17:35:06.774847415 +0000 UTC m=+1050.232878956" watchObservedRunningTime="2026-03-20 17:35:06.810786001 +0000 UTC m=+1050.268817542" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.831009 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" podStartSLOduration=3.757954896 podStartE2EDuration="15.830991938s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.271614549 +0000 UTC m=+1036.729646090" 
lastFinishedPulling="2026-03-20 17:35:05.344651591 +0000 UTC m=+1048.802683132" observedRunningTime="2026-03-20 17:35:06.825216366 +0000 UTC m=+1050.283247907" watchObservedRunningTime="2026-03-20 17:35:06.830991938 +0000 UTC m=+1050.289023479" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.848382 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" podStartSLOduration=3.581339759 podStartE2EDuration="15.848366607s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.102983774 +0000 UTC m=+1036.561015315" lastFinishedPulling="2026-03-20 17:35:05.370010622 +0000 UTC m=+1048.828042163" observedRunningTime="2026-03-20 17:35:06.842002446 +0000 UTC m=+1050.300033987" watchObservedRunningTime="2026-03-20 17:35:06.848366607 +0000 UTC m=+1050.306398138" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.860751 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" podStartSLOduration=3.126200065 podStartE2EDuration="15.860733588s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.598355068 +0000 UTC m=+1036.056386609" lastFinishedPulling="2026-03-20 17:35:05.332888591 +0000 UTC m=+1048.790920132" observedRunningTime="2026-03-20 17:35:06.856153143 +0000 UTC m=+1050.314184694" watchObservedRunningTime="2026-03-20 17:35:06.860733588 +0000 UTC m=+1050.318765129" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.874887 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" podStartSLOduration=3.093441453 podStartE2EDuration="15.874875495s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.563494728 +0000 UTC m=+1036.021526269" 
lastFinishedPulling="2026-03-20 17:35:05.34492874 +0000 UTC m=+1048.802960311" observedRunningTime="2026-03-20 17:35:06.873458669 +0000 UTC m=+1050.331490210" watchObservedRunningTime="2026-03-20 17:35:06.874875495 +0000 UTC m=+1050.332907026" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.925786 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" podStartSLOduration=3.61365309 podStartE2EDuration="15.925771452s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.08163034 +0000 UTC m=+1036.539661901" lastFinishedPulling="2026-03-20 17:35:05.393748722 +0000 UTC m=+1048.851780263" observedRunningTime="2026-03-20 17:35:06.908413744 +0000 UTC m=+1050.366445275" watchObservedRunningTime="2026-03-20 17:35:06.925771452 +0000 UTC m=+1050.383802993" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.926927 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" podStartSLOduration=3.509807539 podStartE2EDuration="15.926922878s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.915745241 +0000 UTC m=+1036.373776782" lastFinishedPulling="2026-03-20 17:35:05.33286058 +0000 UTC m=+1048.790892121" observedRunningTime="2026-03-20 17:35:06.924239013 +0000 UTC m=+1050.382270554" watchObservedRunningTime="2026-03-20 17:35:06.926922878 +0000 UTC m=+1050.384954419" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.948198 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" podStartSLOduration=3.2984648659999998 podStartE2EDuration="15.94818447s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.721266889 +0000 UTC m=+1036.179298430" 
lastFinishedPulling="2026-03-20 17:35:05.370986453 +0000 UTC m=+1048.829018034" observedRunningTime="2026-03-20 17:35:06.945234557 +0000 UTC m=+1050.403266088" watchObservedRunningTime="2026-03-20 17:35:06.94818447 +0000 UTC m=+1050.406216011" Mar 20 17:35:07 crc kubenswrapper[4795]: I0320 17:35:07.579126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:35:07 crc kubenswrapper[4795]: E0320 17:35:07.579360 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:35:07 crc kubenswrapper[4795]: E0320 17:35:07.579470 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:23.579440684 +0000 UTC m=+1067.037472265 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:35:07 crc kubenswrapper[4795]: I0320 17:35:07.781949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:35:07 crc kubenswrapper[4795]: E0320 17:35:07.783233 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:35:07 crc kubenswrapper[4795]: E0320 17:35:07.783288 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:23.78327148 +0000 UTC m=+1067.241303031 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:35:08 crc kubenswrapper[4795]: I0320 17:35:08.388232 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:08 crc kubenswrapper[4795]: E0320 17:35:08.388388 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:35:08 crc kubenswrapper[4795]: E0320 17:35:08.388790 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:24.388765321 +0000 UTC m=+1067.846796902 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:35:08 crc kubenswrapper[4795]: I0320 17:35:08.389441 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:08 crc kubenswrapper[4795]: E0320 17:35:08.389570 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:35:08 crc kubenswrapper[4795]: E0320 17:35:08.389634 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:24.389618508 +0000 UTC m=+1067.847650079 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.299966 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.300476 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.300554 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.301663 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.301825 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" 
containerID="cri-o://f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc" gracePeriod=600 Mar 20 17:35:11 crc kubenswrapper[4795]: E0320 17:35:11.506116 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8702afd1_abd3_42d0_91e6_048802e98829.slice/crio-conmon-f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.733019 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc" exitCode=0 Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.733074 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc"} Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.733109 4795 scope.go:117] "RemoveContainer" containerID="c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.870625 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.885338 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.907383 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 
17:35:11.997244 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.998556 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.999508 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.063293 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.181492 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.217568 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.230867 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.353245 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.440956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.537571 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.784149 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" event={"ID":"071f0af8-4164-4f95-b0ee-720e3b3097f3","Type":"ContainerStarted","Data":"08aa33d7d50ce89e706ba6ddaecbe433defc8310c802c09b18ddb8d59573c8d5"} Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.785109 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.789333 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd"} Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.792491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" event={"ID":"7a887d91-fa86-45d2-a6be-aa7326f7d544","Type":"ContainerStarted","Data":"7b216096aef2bce4b6237b5c7e8583113b01018eeee356898b18536631190edc"} Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.792928 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.820598 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" podStartSLOduration=2.845517042 podStartE2EDuration="24.820578279s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.103196001 +0000 UTC m=+1036.561227542" 
lastFinishedPulling="2026-03-20 17:35:15.078257238 +0000 UTC m=+1058.536288779" observedRunningTime="2026-03-20 17:35:15.818054199 +0000 UTC m=+1059.276085750" watchObservedRunningTime="2026-03-20 17:35:15.820578279 +0000 UTC m=+1059.278609820" Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.821846 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" podStartSLOduration=2.846004498 podStartE2EDuration="24.821840119s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.103132759 +0000 UTC m=+1036.561164300" lastFinishedPulling="2026-03-20 17:35:15.07896838 +0000 UTC m=+1058.536999921" observedRunningTime="2026-03-20 17:35:15.801899209 +0000 UTC m=+1059.259930760" watchObservedRunningTime="2026-03-20 17:35:15.821840119 +0000 UTC m=+1059.279871650" Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.824117 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" event={"ID":"46248665-6f9f-46e0-8db7-6be8c47cf521","Type":"ContainerStarted","Data":"57ca24c7cdcdd3ae226b34644776b3fe2e822c9dc0a38813764c20e3a03aa9c9"} Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.824647 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.826663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" event={"ID":"84a19583-b173-4fb9-8b83-d9c41a5faf79","Type":"ContainerStarted","Data":"5bd228d6ce97c7a60b4988a6ee3699becc486c8687f6bd8bafd4edcb8718562e"} Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.827001 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.828675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" event={"ID":"933bcfd5-f2d1-404f-876d-1d3da597f415","Type":"ContainerStarted","Data":"dccf44000382af16a69bc6c4de3ec06749846b5a3dd292e472c90cab8ba57ef8"} Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.828927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.854858 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" podStartSLOduration=3.378006848 podStartE2EDuration="27.854835008s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.278264769 +0000 UTC m=+1036.736296320" lastFinishedPulling="2026-03-20 17:35:17.755092939 +0000 UTC m=+1061.213124480" observedRunningTime="2026-03-20 17:35:18.84732041 +0000 UTC m=+1062.305351951" watchObservedRunningTime="2026-03-20 17:35:18.854835008 +0000 UTC m=+1062.312866579" Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.870773 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" podStartSLOduration=2.397777453 podStartE2EDuration="26.87075417s" podCreationTimestamp="2026-03-20 17:34:52 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.279451667 +0000 UTC m=+1036.737483218" lastFinishedPulling="2026-03-20 17:35:17.752428394 +0000 UTC m=+1061.210459935" observedRunningTime="2026-03-20 17:35:18.866139284 +0000 UTC m=+1062.324170825" watchObservedRunningTime="2026-03-20 17:35:18.87075417 +0000 UTC m=+1062.328785711" Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 
17:35:18.896349 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" podStartSLOduration=3.216404215 podStartE2EDuration="27.896330708s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.108931042 +0000 UTC m=+1036.566962583" lastFinishedPulling="2026-03-20 17:35:17.788857535 +0000 UTC m=+1061.246889076" observedRunningTime="2026-03-20 17:35:18.891818485 +0000 UTC m=+1062.349850106" watchObservedRunningTime="2026-03-20 17:35:18.896330708 +0000 UTC m=+1062.354362249" Mar 20 17:35:20 crc kubenswrapper[4795]: I0320 17:35:20.844576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" event={"ID":"84901a7b-ddbf-47d9-954f-c167cd9cd46c","Type":"ContainerStarted","Data":"003cafa21df19b3cddfac02d428915e46f4c5ad948b67de03d0d9f02908a76cb"} Mar 20 17:35:20 crc kubenswrapper[4795]: I0320 17:35:20.845229 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" Mar 20 17:35:20 crc kubenswrapper[4795]: I0320 17:35:20.867205 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" podStartSLOduration=3.099653678 podStartE2EDuration="29.867182595s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.905763816 +0000 UTC m=+1036.363795357" lastFinishedPulling="2026-03-20 17:35:19.673292703 +0000 UTC m=+1063.131324274" observedRunningTime="2026-03-20 17:35:20.863169068 +0000 UTC m=+1064.321200639" watchObservedRunningTime="2026-03-20 17:35:20.867182595 +0000 UTC m=+1064.325214146" Mar 20 17:35:22 crc kubenswrapper[4795]: I0320 17:35:22.076964 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" Mar 20 17:35:22 crc kubenswrapper[4795]: I0320 17:35:22.144762 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:35:22 crc kubenswrapper[4795]: I0320 17:35:22.289901 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:35:22 crc kubenswrapper[4795]: I0320 17:35:22.506588 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:35:22 crc kubenswrapper[4795]: I0320 17:35:22.559761 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:35:23 crc kubenswrapper[4795]: I0320 17:35:23.646612 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:35:23 crc kubenswrapper[4795]: I0320 17:35:23.655141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:35:23 crc kubenswrapper[4795]: I0320 17:35:23.817923 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:35:23 crc kubenswrapper[4795]: I0320 17:35:23.848590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:35:23 crc kubenswrapper[4795]: I0320 17:35:23.854265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.060043 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"] Mar 20 17:35:24 crc kubenswrapper[4795]: W0320 17:35:24.065894 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc0f2e63_50dd_424e_af01_3d09c9edd5b3.slice/crio-3382651b40d09007d4926d89d27a532691817b881fa456fc91e8b5baa43edf3d WatchSource:0}: Error finding container 3382651b40d09007d4926d89d27a532691817b881fa456fc91e8b5baa43edf3d: Status 404 returned error can't find the container with id 3382651b40d09007d4926d89d27a532691817b881fa456fc91e8b5baa43edf3d Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.103893 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.459936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.460050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.467105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.467105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.557043 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"] Mar 20 17:35:24 crc kubenswrapper[4795]: W0320 17:35:24.570963 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0fa84d9_bfa8_4b4a_82d7_51e5ae87e0d2.slice/crio-3bfbe7707b228d5b66b36a421d9d0c5aa49762413afa1993eb2a6f49f3846194 WatchSource:0}: Error finding container 3bfbe7707b228d5b66b36a421d9d0c5aa49762413afa1993eb2a6f49f3846194: Status 404 returned error can't find the container with id 3bfbe7707b228d5b66b36a421d9d0c5aa49762413afa1993eb2a6f49f3846194 Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.739927 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.875237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" event={"ID":"fc0f2e63-50dd-424e-af01-3d09c9edd5b3","Type":"ContainerStarted","Data":"3382651b40d09007d4926d89d27a532691817b881fa456fc91e8b5baa43edf3d"} Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.876202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" event={"ID":"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2","Type":"ContainerStarted","Data":"3bfbe7707b228d5b66b36a421d9d0c5aa49762413afa1993eb2a6f49f3846194"} Mar 20 17:35:25 crc kubenswrapper[4795]: I0320 17:35:25.001890 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"] Mar 20 17:35:25 crc kubenswrapper[4795]: I0320 17:35:25.884621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" 
event={"ID":"0d8b26db-957e-4c0e-bb22-42f12d5beb0b","Type":"ContainerStarted","Data":"8bbd8622b4d04376a565d7c90ace4706045eef4cd55459181da6b771c44772d3"} Mar 20 17:35:25 crc kubenswrapper[4795]: I0320 17:35:25.884939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" event={"ID":"0d8b26db-957e-4c0e-bb22-42f12d5beb0b","Type":"ContainerStarted","Data":"c93cffcb3922186ecbb0e79bbbca9058705018e87e04677df7aaa4e3ff73a1ce"} Mar 20 17:35:25 crc kubenswrapper[4795]: I0320 17:35:25.884953 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:25 crc kubenswrapper[4795]: I0320 17:35:25.918887 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" podStartSLOduration=33.918868231 podStartE2EDuration="33.918868231s" podCreationTimestamp="2026-03-20 17:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:25.909733554 +0000 UTC m=+1069.367765095" watchObservedRunningTime="2026-03-20 17:35:25.918868231 +0000 UTC m=+1069.376899772" Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.901601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" event={"ID":"fc0f2e63-50dd-424e-af01-3d09c9edd5b3","Type":"ContainerStarted","Data":"cb585f5bbb6c67204cf96966227bb2b3b84d639644885462b81b157f3f3de829"} Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.902056 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.903504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" event={"ID":"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2","Type":"ContainerStarted","Data":"6cb9b7e64562f4a00861acdd8a538d1b881bacc3e584125bb799f3c0d02c4d80"} Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.903868 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.931890 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" podStartSLOduration=34.238478526 podStartE2EDuration="36.931866039s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:35:24.067869109 +0000 UTC m=+1067.525900660" lastFinishedPulling="2026-03-20 17:35:26.761256622 +0000 UTC m=+1070.219288173" observedRunningTime="2026-03-20 17:35:27.92557803 +0000 UTC m=+1071.383609601" watchObservedRunningTime="2026-03-20 17:35:27.931866039 +0000 UTC m=+1071.389897610" Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.952029 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" podStartSLOduration=34.750588686 podStartE2EDuration="36.952009025s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:35:24.57330806 +0000 UTC m=+1068.031339641" lastFinishedPulling="2026-03-20 17:35:26.774728449 +0000 UTC m=+1070.232759980" observedRunningTime="2026-03-20 17:35:27.948444502 +0000 UTC m=+1071.406476083" watchObservedRunningTime="2026-03-20 17:35:27.952009025 +0000 UTC m=+1071.410040586" Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.076661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.887296 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"] Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.890894 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"] Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.892811 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.998620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.998781 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vbh9\" (UniqueName: \"kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.998828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.100378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7vbh9\" (UniqueName: \"kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.100441 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.100502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.100924 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.101158 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.120605 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vbh9\" (UniqueName: 
\"kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.222790 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:33 crc kubenswrapper[4795]: W0320 17:35:33.671919 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a28038_950d_4a3f_bbac_2084cf8e0ac4.slice/crio-341ceaef6f78ac3694a26b1214c08ac55be85fcf2a82904786498c5cb6d2a595 WatchSource:0}: Error finding container 341ceaef6f78ac3694a26b1214c08ac55be85fcf2a82904786498c5cb6d2a595: Status 404 returned error can't find the container with id 341ceaef6f78ac3694a26b1214c08ac55be85fcf2a82904786498c5cb6d2a595 Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.680974 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"] Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.827651 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.956450 4795 generic.go:334] "Generic (PLEG): container finished" podID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerID="a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf" exitCode=0 Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.956497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerDied","Data":"a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf"} Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.956527 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerStarted","Data":"341ceaef6f78ac3694a26b1214c08ac55be85fcf2a82904786498c5cb6d2a595"} Mar 20 17:35:34 crc kubenswrapper[4795]: I0320 17:35:34.113199 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:35:34 crc kubenswrapper[4795]: I0320 17:35:34.750756 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:34 crc kubenswrapper[4795]: I0320 17:35:34.967331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerStarted","Data":"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670"} Mar 20 17:35:35 crc kubenswrapper[4795]: I0320 17:35:35.983518 4795 generic.go:334] "Generic (PLEG): container finished" podID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerID="4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670" exitCode=0 Mar 20 17:35:35 crc kubenswrapper[4795]: I0320 17:35:35.983741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerDied","Data":"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670"} Mar 20 17:35:36 crc kubenswrapper[4795]: I0320 17:35:36.999036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerStarted","Data":"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400"} Mar 20 17:35:37 crc kubenswrapper[4795]: I0320 17:35:37.038509 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9sc2c" podStartSLOduration=2.559722849 podStartE2EDuration="5.038475774s" podCreationTimestamp="2026-03-20 17:35:32 +0000 UTC" firstStartedPulling="2026-03-20 17:35:33.957960046 +0000 UTC m=+1077.415991607" lastFinishedPulling="2026-03-20 17:35:36.436712961 +0000 UTC m=+1079.894744532" observedRunningTime="2026-03-20 17:35:37.026213338 +0000 UTC m=+1080.484244919" watchObservedRunningTime="2026-03-20 17:35:37.038475774 +0000 UTC m=+1080.496507365" Mar 20 17:35:43 crc kubenswrapper[4795]: I0320 17:35:43.223065 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:43 crc kubenswrapper[4795]: I0320 17:35:43.223771 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:43 crc kubenswrapper[4795]: I0320 17:35:43.300000 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:44 crc kubenswrapper[4795]: I0320 17:35:44.137196 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:44 crc kubenswrapper[4795]: I0320 17:35:44.190942 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"] Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.083552 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9sc2c" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="registry-server" containerID="cri-o://c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400" gracePeriod=2 Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.585904 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.607635 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities\") pod \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.607725 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content\") pod \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.607788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vbh9\" (UniqueName: \"kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9\") pod \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.608879 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities" (OuterVolumeSpecName: "utilities") pod "28a28038-950d-4a3f-bbac-2084cf8e0ac4" (UID: "28a28038-950d-4a3f-bbac-2084cf8e0ac4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.609784 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.616006 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9" (OuterVolumeSpecName: "kube-api-access-7vbh9") pod "28a28038-950d-4a3f-bbac-2084cf8e0ac4" (UID: "28a28038-950d-4a3f-bbac-2084cf8e0ac4"). InnerVolumeSpecName "kube-api-access-7vbh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.654069 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28a28038-950d-4a3f-bbac-2084cf8e0ac4" (UID: "28a28038-950d-4a3f-bbac-2084cf8e0ac4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.711015 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.711053 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vbh9\" (UniqueName: \"kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.097341 4795 generic.go:334] "Generic (PLEG): container finished" podID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerID="c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400" exitCode=0 Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.097449 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sc2c" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.097435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerDied","Data":"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400"} Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.098041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerDied","Data":"341ceaef6f78ac3694a26b1214c08ac55be85fcf2a82904786498c5cb6d2a595"} Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.098086 4795 scope.go:117] "RemoveContainer" containerID="c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.135812 4795 scope.go:117] "RemoveContainer" 
containerID="4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.155042 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"] Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.161938 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"] Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.173295 4795 scope.go:117] "RemoveContainer" containerID="a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.219851 4795 scope.go:117] "RemoveContainer" containerID="c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400" Mar 20 17:35:47 crc kubenswrapper[4795]: E0320 17:35:47.220370 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400\": container with ID starting with c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400 not found: ID does not exist" containerID="c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.220437 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400"} err="failed to get container status \"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400\": rpc error: code = NotFound desc = could not find container \"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400\": container with ID starting with c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400 not found: ID does not exist" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.220476 4795 scope.go:117] "RemoveContainer" 
containerID="4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670" Mar 20 17:35:47 crc kubenswrapper[4795]: E0320 17:35:47.221183 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670\": container with ID starting with 4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670 not found: ID does not exist" containerID="4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.221238 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670"} err="failed to get container status \"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670\": rpc error: code = NotFound desc = could not find container \"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670\": container with ID starting with 4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670 not found: ID does not exist" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.221274 4795 scope.go:117] "RemoveContainer" containerID="a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf" Mar 20 17:35:47 crc kubenswrapper[4795]: E0320 17:35:47.221681 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf\": container with ID starting with a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf not found: ID does not exist" containerID="a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.221738 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf"} err="failed to get container status \"a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf\": rpc error: code = NotFound desc = could not find container \"a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf\": container with ID starting with a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf not found: ID does not exist" Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.263828 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" path="/var/lib/kubelet/pods/28a28038-950d-4a3f-bbac-2084cf8e0ac4/volumes" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.585733 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"] Mar 20 17:35:53 crc kubenswrapper[4795]: E0320 17:35:53.586482 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="extract-utilities" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.586495 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="extract-utilities" Mar 20 17:35:53 crc kubenswrapper[4795]: E0320 17:35:53.586504 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="extract-content" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.586510 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="extract-content" Mar 20 17:35:53 crc kubenswrapper[4795]: E0320 17:35:53.586521 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="registry-server" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.586560 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="registry-server" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.586676 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="registry-server" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.587347 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.594596 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.594896 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kk7jg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.595341 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.595456 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.605340 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"] Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.615328 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"] Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.616397 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.617940 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.642352 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"] Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.728876 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.728941 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmpx4\" (UniqueName: \"kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.729499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbfdp\" (UniqueName: \"kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.729568 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.729665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.830571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbfdp\" (UniqueName: \"kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.830624 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.830753 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.830797 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: 
I0320 17:35:53.830846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmpx4\" (UniqueName: \"kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.831823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.832094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.832451 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.857567 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmpx4\" (UniqueName: \"kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.866455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mbfdp\" (UniqueName: \"kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.904482 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.930547 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:54 crc kubenswrapper[4795]: I0320 17:35:54.371724 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"] Mar 20 17:35:54 crc kubenswrapper[4795]: W0320 17:35:54.426993 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11fbdcb2_cc31_4fe8_be5f_80df050a7a93.slice/crio-c17bc707b7e046c509096b18cdd373a490c6533a741e73bc3005dac5daab314f WatchSource:0}: Error finding container c17bc707b7e046c509096b18cdd373a490c6533a741e73bc3005dac5daab314f: Status 404 returned error can't find the container with id c17bc707b7e046c509096b18cdd373a490c6533a741e73bc3005dac5daab314f Mar 20 17:35:54 crc kubenswrapper[4795]: I0320 17:35:54.427472 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"] Mar 20 17:35:55 crc kubenswrapper[4795]: I0320 17:35:55.160718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" event={"ID":"11fbdcb2-cc31-4fe8-be5f-80df050a7a93","Type":"ContainerStarted","Data":"c17bc707b7e046c509096b18cdd373a490c6533a741e73bc3005dac5daab314f"} Mar 20 17:35:55 crc kubenswrapper[4795]: I0320 17:35:55.162519 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" 
event={"ID":"7f272673-089e-4e0d-ad79-ee04004f6c62","Type":"ContainerStarted","Data":"c52b78bf8dbc75f740fc999798595970e27f414c3577a45dac0538a7d22d794f"} Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.173720 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.195738 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.197561 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.201535 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.272516 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.272625 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.272657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgldg\" (UniqueName: \"kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " 
pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.374377 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.374438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgldg\" (UniqueName: \"kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.374472 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.375431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.375917 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.400292 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgldg\" (UniqueName: \"kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.513551 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.521320 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.560772 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.562629 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.567149 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.584659 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwzs\" (UniqueName: \"kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.584957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" 
Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.585068 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.819296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwzs\" (UniqueName: \"kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.819574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.819599 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.820364 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.820891 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.857840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwzs\" (UniqueName: \"kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.944018 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.158243 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.168967 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.170233 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.172173 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.172385 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-84wwv" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.172572 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.173231 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.173416 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.173442 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.173501 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.184968 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224774 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224862 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4h5b\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.225003 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.225027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.225043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.225065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" 
(UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326553 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326600 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326677 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326762 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326873 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4h5b\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.327131 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.327364 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.327427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.327861 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.329608 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.330137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 
17:35:57.330862 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.331652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.332563 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.333807 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.342491 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4h5b\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.349533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.396310 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.457723 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.459467 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.465362 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.465443 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.466202 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.466358 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.466514 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.466709 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.466860 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pf5bc" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.483857 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:35:57 
crc kubenswrapper[4795]: I0320 17:35:57.498569 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530108 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530151 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530187 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6r2k\" (UniqueName: 
\"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631401 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631493 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631545 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6r2k\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631591 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.633351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.633744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.634060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc 
kubenswrapper[4795]: I0320 17:35:57.634175 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.636671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.643764 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.653620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.660365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.661255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j6r2k\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.661275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.662834 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.663682 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.831654 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.196131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" event={"ID":"188f326f-74f0-423d-9ae1-54aae0c1474e","Type":"ContainerStarted","Data":"eceba19972887264330468d5439623bcd97b3e95bfc235711a65beb5805973f3"} Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.877746 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.879117 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.892136 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.897977 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.903050 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.903781 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5tdxw" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.904106 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.913719 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: 
\"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051514 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051699 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwzw\" (UniqueName: \"kubernetes.io/projected/0f5a24ef-fc80-4386-9f81-5f21154223f3-kube-api-access-npwzw\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051759 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051785 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153202 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153280 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 
17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153399 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153552 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-npwzw\" (UniqueName: \"kubernetes.io/projected/0f5a24ef-fc80-4386-9f81-5f21154223f3-kube-api-access-npwzw\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.154442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.154584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.165414 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.169545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.176356 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwzw\" (UniqueName: \"kubernetes.io/projected/0f5a24ef-fc80-4386-9f81-5f21154223f3-kube-api-access-npwzw\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.177123 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.179444 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.242456 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.136777 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567136-j4mtv"] Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.138207 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.142231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.142408 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.143108 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.152437 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-j4mtv"] Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.270734 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28trk\" (UniqueName: \"kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk\") pod \"auto-csr-approver-29567136-j4mtv\" (UID: \"38f88deb-b38d-4c52-a901-baeb9da08559\") " pod="openshift-infra/auto-csr-approver-29567136-j4mtv" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.311877 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.313904 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.317050 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-vttrp" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.317320 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.320210 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.320502 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.328158 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.372456 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28trk\" (UniqueName: \"kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk\") pod \"auto-csr-approver-29567136-j4mtv\" (UID: \"38f88deb-b38d-4c52-a901-baeb9da08559\") " pod="openshift-infra/auto-csr-approver-29567136-j4mtv" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.405431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28trk\" (UniqueName: \"kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk\") pod \"auto-csr-approver-29567136-j4mtv\" (UID: \"38f88deb-b38d-4c52-a901-baeb9da08559\") " pod="openshift-infra/auto-csr-approver-29567136-j4mtv" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.473912 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.474178 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.474369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.474604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.474846 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhwkw\" (UniqueName: \"kubernetes.io/projected/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kube-api-access-zhwkw\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.475072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.475240 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.475437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.491129 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576836 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576911 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhwkw\" (UniqueName: \"kubernetes.io/projected/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kube-api-access-zhwkw\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " 
pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.577023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.577412 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.578672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.577819 4795 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.579329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.580006 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.580320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.583267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.614190 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zhwkw\" (UniqueName: \"kubernetes.io/projected/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kube-api-access-zhwkw\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.617908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.631778 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.776907 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.778086 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.782576 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.782879 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.783087 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4ttzr" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.790259 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.881477 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-kolla-config\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.881545 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t84rh\" (UniqueName: \"kubernetes.io/projected/ca95ec62-fce9-4c91-bb59-fa80f512edba-kube-api-access-t84rh\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.881573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.881611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-config-data\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.881657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.983975 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t84rh\" (UniqueName: \"kubernetes.io/projected/ca95ec62-fce9-4c91-bb59-fa80f512edba-kube-api-access-t84rh\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.984090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.984242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-config-data\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.984376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.984455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-kolla-config\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.985459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-kolla-config\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.985910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-config-data\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.989476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.995049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:01 crc kubenswrapper[4795]: I0320 17:36:01.001003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t84rh\" (UniqueName: 
\"kubernetes.io/projected/ca95ec62-fce9-4c91-bb59-fa80f512edba-kube-api-access-t84rh\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0" Mar 20 17:36:01 crc kubenswrapper[4795]: I0320 17:36:01.100827 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 17:36:01 crc kubenswrapper[4795]: I0320 17:36:01.263276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" event={"ID":"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2","Type":"ContainerStarted","Data":"a6075430a5a7a42dcee3b92020556eee6821261a85421f6ff8cc34985b56804c"} Mar 20 17:36:02 crc kubenswrapper[4795]: I0320 17:36:02.779959 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:36:02 crc kubenswrapper[4795]: I0320 17:36:02.782230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:36:02 crc kubenswrapper[4795]: I0320 17:36:02.784367 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dwl8l" Mar 20 17:36:02 crc kubenswrapper[4795]: I0320 17:36:02.798984 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:36:02 crc kubenswrapper[4795]: I0320 17:36:02.932146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k6xp\" (UniqueName: \"kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp\") pod \"kube-state-metrics-0\" (UID: \"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8\") " pod="openstack/kube-state-metrics-0" Mar 20 17:36:03 crc kubenswrapper[4795]: I0320 17:36:03.035923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k6xp\" (UniqueName: 
\"kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp\") pod \"kube-state-metrics-0\" (UID: \"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8\") " pod="openstack/kube-state-metrics-0" Mar 20 17:36:03 crc kubenswrapper[4795]: I0320 17:36:03.083905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k6xp\" (UniqueName: \"kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp\") pod \"kube-state-metrics-0\" (UID: \"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8\") " pod="openstack/kube-state-metrics-0" Mar 20 17:36:03 crc kubenswrapper[4795]: I0320 17:36:03.168657 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.061628 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dnp2g"] Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.062866 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.065241 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-tm86q" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.065462 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.065643 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.071161 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dsqcc"] Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.072668 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.076956 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g"] Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.098164 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dsqcc"] Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190354 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-log\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190407 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-ovn-controller-tls-certs\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5675bf5e-3a57-4082-8631-680ced6fb634-scripts\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 
17:36:06.190495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-lib\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8j4m\" (UniqueName: \"kubernetes.io/projected/28df10bb-d6a9-47a9-9b79-0bb9665529ef-kube-api-access-h8j4m\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-run\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190553 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-log-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190567 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-etc-ovs\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190598 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfrvv\" (UniqueName: \"kubernetes.io/projected/5675bf5e-3a57-4082-8631-680ced6fb634-kube-api-access-dfrvv\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190630 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190647 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28df10bb-d6a9-47a9-9b79-0bb9665529ef-scripts\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-combined-ca-bundle\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.291961 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-combined-ca-bundle\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.291999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28df10bb-d6a9-47a9-9b79-0bb9665529ef-scripts\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-log\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292084 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-ovn-controller-tls-certs\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292098 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5675bf5e-3a57-4082-8631-680ced6fb634-scripts\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-lib\") pod \"ovn-controller-ovs-dsqcc\" (UID: 
\"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8j4m\" (UniqueName: \"kubernetes.io/projected/28df10bb-d6a9-47a9-9b79-0bb9665529ef-kube-api-access-h8j4m\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-run\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292169 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-log-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-etc-ovs\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfrvv\" (UniqueName: \"kubernetes.io/projected/5675bf5e-3a57-4082-8631-680ced6fb634-kube-api-access-dfrvv\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc 
kubenswrapper[4795]: I0320 17:36:06.292256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-log-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-etc-ovs\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-log\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294221 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-lib\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294819 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-run\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.296954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5675bf5e-3a57-4082-8631-680ced6fb634-scripts\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.297197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28df10bb-d6a9-47a9-9b79-0bb9665529ef-scripts\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.297321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.297892 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-ovn-controller-tls-certs\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" 
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.308299 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8j4m\" (UniqueName: \"kubernetes.io/projected/28df10bb-d6a9-47a9-9b79-0bb9665529ef-kube-api-access-h8j4m\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.309783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-combined-ca-bundle\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.314094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfrvv\" (UniqueName: \"kubernetes.io/projected/5675bf5e-3a57-4082-8631-680ced6fb634-kube-api-access-dfrvv\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.402451 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.404084 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.839044 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.840631 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.843158 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.843219 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.844225 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.844333 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.844646 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6cdhb" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.864835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.946259 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.946638 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srznh\" (UniqueName: \"kubernetes.io/projected/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-kube-api-access-srznh\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.946822 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.946969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.947116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-config\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.947269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.947387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.947520 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.011406 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.012088 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbfdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabil
ities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-2zkfg_openstack(7f272673-089e-4e0d-ad79-ee04004f6c62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.013878 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" podUID="7f272673-089e-4e0d-ad79-ee04004f6c62" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.025636 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.025778 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmpx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-t4qxc_openstack(11fbdcb2-cc31-4fe8-be5f-80df050a7a93): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.027043 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" podUID="11fbdcb2-cc31-4fe8-be5f-80df050a7a93" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049324 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-config\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049401 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049425 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srznh\" (UniqueName: \"kubernetes.io/projected/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-kube-api-access-srznh\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049489 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049520 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.050587 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.051197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.051484 4795 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.054646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-config\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.056510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.058512 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.068967 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.070805 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.072605 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ldxcv" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.075092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.076275 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.076516 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.076658 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.077206 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srznh\" (UniqueName: \"kubernetes.io/projected/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-kube-api-access-srznh\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.091054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.098814 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 
17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150545 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d96tc\" (UniqueName: \"kubernetes.io/projected/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-kube-api-access-d96tc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150623 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150650 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150770 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-config\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150811 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.198391 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.252794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.252866 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.252900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.252966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.253000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-config\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.253040 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.253060 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.253130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d96tc\" (UniqueName: \"kubernetes.io/projected/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-kube-api-access-d96tc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.253365 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.254208 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.258336 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.258430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.261216 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.263701 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.270080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.273930 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d96tc\" (UniqueName: \"kubernetes.io/projected/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-kube-api-access-d96tc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.303789 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.369460 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.785124 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.795921 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:36:10 crc kubenswrapper[4795]: W0320 17:36:10.798268 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f5a24ef_fc80_4386_9f81_5f21154223f3.slice/crio-7073a77e150e717db0aee45fa7ce5bfee04124e240591e313a106e97438ee45c WatchSource:0}: Error finding container 7073a77e150e717db0aee45fa7ce5bfee04124e240591e313a106e97438ee45c: Status 404 returned error can't find the container with id 7073a77e150e717db0aee45fa7ce5bfee04124e240591e313a106e97438ee45c Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.811766 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.842180 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.846559 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.977909 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.987708 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config\") pod \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.987774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config\") pod \"7f272673-089e-4e0d-ad79-ee04004f6c62\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.987864 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmpx4\" (UniqueName: \"kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4\") pod \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.987905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc\") pod \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.987939 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbfdp\" (UniqueName: \"kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp\") pod \"7f272673-089e-4e0d-ad79-ee04004f6c62\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " Mar 20 17:36:10 crc 
kubenswrapper[4795]: I0320 17:36:10.989316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config" (OuterVolumeSpecName: "config") pod "7f272673-089e-4e0d-ad79-ee04004f6c62" (UID: "7f272673-089e-4e0d-ad79-ee04004f6c62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.989619 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11fbdcb2-cc31-4fe8-be5f-80df050a7a93" (UID: "11fbdcb2-cc31-4fe8-be5f-80df050a7a93"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.993023 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g"] Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.993094 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4" (OuterVolumeSpecName: "kube-api-access-fmpx4") pod "11fbdcb2-cc31-4fe8-be5f-80df050a7a93" (UID: "11fbdcb2-cc31-4fe8-be5f-80df050a7a93"). InnerVolumeSpecName "kube-api-access-fmpx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.993946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp" (OuterVolumeSpecName: "kube-api-access-mbfdp") pod "7f272673-089e-4e0d-ad79-ee04004f6c62" (UID: "7f272673-089e-4e0d-ad79-ee04004f6c62"). InnerVolumeSpecName "kube-api-access-mbfdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.001658 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config" (OuterVolumeSpecName: "config") pod "11fbdcb2-cc31-4fe8-be5f-80df050a7a93" (UID: "11fbdcb2-cc31-4fe8-be5f-80df050a7a93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:11 crc kubenswrapper[4795]: W0320 17:36:11.014633 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca95ec62_fce9_4c91_bb59_fa80f512edba.slice/crio-a2af658a384a60f6fe161b1f5cba4d8abda1a769472141968e9444e6dd01ee96 WatchSource:0}: Error finding container a2af658a384a60f6fe161b1f5cba4d8abda1a769472141968e9444e6dd01ee96: Status 404 returned error can't find the container with id a2af658a384a60f6fe161b1f5cba4d8abda1a769472141968e9444e6dd01ee96 Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.018327 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.035498 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-j4mtv"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.040409 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.090173 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmpx4\" (UniqueName: \"kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.090209 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc\") 
on node \"crc\" DevicePath \"\"" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.090222 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbfdp\" (UniqueName: \"kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.090234 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.090246 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.097033 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 17:36:11 crc kubenswrapper[4795]: W0320 17:36:11.101767 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b9d4ac2_2b66_441a_a6d4_0d467d857f99.slice/crio-59347bda18896f5441b38e7dc9139e1ccfae99e35d69a3c62c0bdf1613d121b8 WatchSource:0}: Error finding container 59347bda18896f5441b38e7dc9139e1ccfae99e35d69a3c62c0bdf1613d121b8: Status 404 returned error can't find the container with id 59347bda18896f5441b38e7dc9139e1ccfae99e35d69a3c62c0bdf1613d121b8 Mar 20 17:36:11 crc kubenswrapper[4795]: W0320 17:36:11.201081 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc07f346e_3e6c_41a5_bdda_67a4a5f04ba7.slice/crio-5ed0c34c3b4eefa74c9608d1cd68f505f81bad8d95898aa642b6cd4ec42c4ff9 WatchSource:0}: Error finding container 5ed0c34c3b4eefa74c9608d1cd68f505f81bad8d95898aa642b6cd4ec42c4ff9: Status 404 returned error can't find the container with id 
5ed0c34c3b4eefa74c9608d1cd68f505f81bad8d95898aa642b6cd4ec42c4ff9 Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.201308 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.323066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f5a24ef-fc80-4386-9f81-5f21154223f3","Type":"ContainerStarted","Data":"7073a77e150e717db0aee45fa7ce5bfee04124e240591e313a106e97438ee45c"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.324243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g" event={"ID":"28df10bb-d6a9-47a9-9b79-0bb9665529ef","Type":"ContainerStarted","Data":"c01b1888a8b541ad374dbfa8a6411bcdf2e68ac5cfc9aaf1bb0d5e9bce548c26"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.325396 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" event={"ID":"11fbdcb2-cc31-4fe8-be5f-80df050a7a93","Type":"ContainerDied","Data":"c17bc707b7e046c509096b18cdd373a490c6533a741e73bc3005dac5daab314f"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.325405 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.326820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerStarted","Data":"0e5a7ece35e45546c5839b24c64b62f1f72a3acb63297d6f97fc0dca60bde01d"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.330286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" event={"ID":"38f88deb-b38d-4c52-a901-baeb9da08559","Type":"ContainerStarted","Data":"d53c776ab6a465f3a075842cf379886815e4d1caf766e21f6fcc37ed80564b8a"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.332018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7","Type":"ContainerStarted","Data":"5ed0c34c3b4eefa74c9608d1cd68f505f81bad8d95898aa642b6cd4ec42c4ff9"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.333150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ca95ec62-fce9-4c91-bb59-fa80f512edba","Type":"ContainerStarted","Data":"a2af658a384a60f6fe161b1f5cba4d8abda1a769472141968e9444e6dd01ee96"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.335232 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3b9d4ac2-2b66-441a-a6d4-0d467d857f99","Type":"ContainerStarted","Data":"59347bda18896f5441b38e7dc9139e1ccfae99e35d69a3c62c0bdf1613d121b8"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.336138 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerStarted","Data":"12a00ee882324adc5e7b3fa5833c8430141d6a20302db2d5f549cf873b0d421d"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.337263 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" event={"ID":"7f272673-089e-4e0d-ad79-ee04004f6c62","Type":"ContainerDied","Data":"c52b78bf8dbc75f740fc999798595970e27f414c3577a45dac0538a7d22d794f"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.337375 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.338510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987","Type":"ContainerStarted","Data":"64b4d3bbfc53ca1c715ea49f2ff8de0536ae2c9c0fde4b3a6dd9f838d8a8b4c0"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.340449 4795 generic.go:334] "Generic (PLEG): container finished" podID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerID="dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589" exitCode=0 Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.340748 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" event={"ID":"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2","Type":"ContainerDied","Data":"dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.346887 4795 generic.go:334] "Generic (PLEG): container finished" podID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerID="bb90f0d50353c309b35e0e03c747d10c744c81a1b2cc09b5b5dec2672959655d" exitCode=0 Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.346973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" event={"ID":"188f326f-74f0-423d-9ae1-54aae0c1474e","Type":"ContainerDied","Data":"bb90f0d50353c309b35e0e03c747d10c744c81a1b2cc09b5b5dec2672959655d"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.356804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8","Type":"ContainerStarted","Data":"715534f72ece852c083764840657cce952ec7708ddcedcd00af2caddc251418f"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.419844 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.432454 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.454171 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.458312 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"] Mar 20 17:36:11 crc kubenswrapper[4795]: E0320 17:36:11.549983 4795 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 20 17:36:11 crc kubenswrapper[4795]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/188f326f-74f0-423d-9ae1-54aae0c1474e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 17:36:11 crc kubenswrapper[4795]: > podSandboxID="eceba19972887264330468d5439623bcd97b3e95bfc235711a65beb5805973f3" Mar 20 17:36:11 crc kubenswrapper[4795]: E0320 17:36:11.550878 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:36:11 crc kubenswrapper[4795]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jgldg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-w4cr8_openstack(188f326f-74f0-423d-9ae1-54aae0c1474e): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/188f326f-74f0-423d-9ae1-54aae0c1474e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 17:36:11 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:36:11 crc kubenswrapper[4795]: E0320 17:36:11.552142 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/188f326f-74f0-423d-9ae1-54aae0c1474e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.922897 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dsqcc"] Mar 20 17:36:11 crc kubenswrapper[4795]: W0320 17:36:11.932513 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5675bf5e_3a57_4082_8631_680ced6fb634.slice/crio-bbfec97bb460eff0d3d30f4022bdb401b51118cb284a06e54e7cb6d9a5314a7f WatchSource:0}: Error finding container bbfec97bb460eff0d3d30f4022bdb401b51118cb284a06e54e7cb6d9a5314a7f: Status 404 returned error can't find the container with id bbfec97bb460eff0d3d30f4022bdb401b51118cb284a06e54e7cb6d9a5314a7f Mar 20 17:36:12 crc kubenswrapper[4795]: I0320 17:36:12.366548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" event={"ID":"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2","Type":"ContainerStarted","Data":"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe"} Mar 20 17:36:12 crc kubenswrapper[4795]: I0320 17:36:12.366765 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:36:12 crc kubenswrapper[4795]: I0320 17:36:12.367767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsqcc" event={"ID":"5675bf5e-3a57-4082-8631-680ced6fb634","Type":"ContainerStarted","Data":"bbfec97bb460eff0d3d30f4022bdb401b51118cb284a06e54e7cb6d9a5314a7f"} Mar 20 17:36:12 crc kubenswrapper[4795]: I0320 17:36:12.388113 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" podStartSLOduration=7.317579223 podStartE2EDuration="16.388051601s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="2026-03-20 17:36:01.051720565 +0000 UTC m=+1104.509752106" lastFinishedPulling="2026-03-20 17:36:10.122192943 +0000 UTC m=+1113.580224484" observedRunningTime="2026-03-20 17:36:12.382763455 +0000 UTC m=+1115.840794996" watchObservedRunningTime="2026-03-20 17:36:12.388051601 +0000 UTC m=+1115.846083162" Mar 20 17:36:13 crc kubenswrapper[4795]: I0320 17:36:13.278983 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="11fbdcb2-cc31-4fe8-be5f-80df050a7a93" path="/var/lib/kubelet/pods/11fbdcb2-cc31-4fe8-be5f-80df050a7a93/volumes" Mar 20 17:36:13 crc kubenswrapper[4795]: I0320 17:36:13.279676 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f272673-089e-4e0d-ad79-ee04004f6c62" path="/var/lib/kubelet/pods/7f272673-089e-4e0d-ad79-ee04004f6c62/volumes" Mar 20 17:36:13 crc kubenswrapper[4795]: I0320 17:36:13.378144 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" event={"ID":"188f326f-74f0-423d-9ae1-54aae0c1474e","Type":"ContainerStarted","Data":"87534ecfb167ccd1f65f7ca6c16ec74ef295ebd69a68982ce1f8e8d67186a8f5"} Mar 20 17:36:13 crc kubenswrapper[4795]: I0320 17:36:13.378436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:36:13 crc kubenswrapper[4795]: I0320 17:36:13.398064 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" podStartSLOduration=4.487993861 podStartE2EDuration="17.398048982s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="2026-03-20 17:35:57.20244868 +0000 UTC m=+1100.660480221" lastFinishedPulling="2026-03-20 17:36:10.112503801 +0000 UTC m=+1113.570535342" observedRunningTime="2026-03-20 17:36:13.394380557 +0000 UTC m=+1116.852412098" watchObservedRunningTime="2026-03-20 17:36:13.398048982 +0000 UTC m=+1116.856080523" Mar 20 17:36:16 crc kubenswrapper[4795]: I0320 17:36:16.945927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:36:17 crc kubenswrapper[4795]: I0320 17:36:17.004929 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:36:17 crc kubenswrapper[4795]: I0320 17:36:17.005126 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="dnsmasq-dns" containerID="cri-o://87534ecfb167ccd1f65f7ca6c16ec74ef295ebd69a68982ce1f8e8d67186a8f5" gracePeriod=10 Mar 20 17:36:17 crc kubenswrapper[4795]: I0320 17:36:17.009927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:36:17 crc kubenswrapper[4795]: I0320 17:36:17.410609 4795 generic.go:334] "Generic (PLEG): container finished" podID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerID="87534ecfb167ccd1f65f7ca6c16ec74ef295ebd69a68982ce1f8e8d67186a8f5" exitCode=0 Mar 20 17:36:17 crc kubenswrapper[4795]: I0320 17:36:17.410932 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" event={"ID":"188f326f-74f0-423d-9ae1-54aae0c1474e","Type":"ContainerDied","Data":"87534ecfb167ccd1f65f7ca6c16ec74ef295ebd69a68982ce1f8e8d67186a8f5"} Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.183495 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-n4gzx"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.184364 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.186191 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.205087 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-n4gzx"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344339 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7zng\" (UniqueName: \"kubernetes.io/projected/85004117-20bc-474e-88f5-ce49032749ff-kube-api-access-c7zng\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344407 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovn-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-combined-ca-bundle\") pod \"ovn-controller-metrics-n4gzx\" (UID: 
\"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85004117-20bc-474e-88f5-ce49032749ff-config\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovs-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.359627 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-655xx"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.360819 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.363025 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.375988 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-655xx"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446067 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7zng\" (UniqueName: \"kubernetes.io/projected/85004117-20bc-474e-88f5-ce49032749ff-kube-api-access-c7zng\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovn-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc 
kubenswrapper[4795]: I0320 17:36:19.446373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm462\" (UniqueName: \"kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446461 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-combined-ca-bundle\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446540 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446612 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovn-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc 
kubenswrapper[4795]: I0320 17:36:19.446669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85004117-20bc-474e-88f5-ce49032749ff-config\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovs-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446867 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovs-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.447472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85004117-20bc-474e-88f5-ce49032749ff-config\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.464811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-combined-ca-bundle\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.465179 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.470120 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-655xx"] Mar 20 17:36:19 crc kubenswrapper[4795]: E0320 17:36:19.470663 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-gm462 ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7f896c8c65-655xx" podUID="5d9eecdd-6791-4b24-8855-6036767861cf" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.475233 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7zng\" (UniqueName: \"kubernetes.io/projected/85004117-20bc-474e-88f5-ce49032749ff-kube-api-access-c7zng\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.500544 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.501788 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.508429 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.511103 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.532658 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.548882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm462\" (UniqueName: \"kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.548943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.548997 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.549097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.551795 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.551872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.553235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.584762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm462\" (UniqueName: \"kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.649996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.650065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.650227 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm699\" (UniqueName: \"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.650353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.650471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.751935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.752003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.752061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm699\" (UniqueName: \"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.752094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.752130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.752864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.753022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.753102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.753336 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.772450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm699\" (UniqueName: \"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.865620 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.432981 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.441810 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.568613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm462\" (UniqueName: \"kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462\") pod \"5d9eecdd-6791-4b24-8855-6036767861cf\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.568765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc\") pod \"5d9eecdd-6791-4b24-8855-6036767861cf\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.568808 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb\") pod \"5d9eecdd-6791-4b24-8855-6036767861cf\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.568837 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config\") pod \"5d9eecdd-6791-4b24-8855-6036767861cf\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.569544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5d9eecdd-6791-4b24-8855-6036767861cf" (UID: "5d9eecdd-6791-4b24-8855-6036767861cf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.569801 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config" (OuterVolumeSpecName: "config") pod "5d9eecdd-6791-4b24-8855-6036767861cf" (UID: "5d9eecdd-6791-4b24-8855-6036767861cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.570072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d9eecdd-6791-4b24-8855-6036767861cf" (UID: "5d9eecdd-6791-4b24-8855-6036767861cf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.575228 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462" (OuterVolumeSpecName: "kube-api-access-gm462") pod "5d9eecdd-6791-4b24-8855-6036767861cf" (UID: "5d9eecdd-6791-4b24-8855-6036767861cf"). InnerVolumeSpecName "kube-api-access-gm462". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.671877 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.671936 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.671956 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.671973 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm462\" (UniqueName: \"kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:21 crc kubenswrapper[4795]: I0320 17:36:21.440501 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:21 crc kubenswrapper[4795]: I0320 17:36:21.482006 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-655xx"] Mar 20 17:36:21 crc kubenswrapper[4795]: I0320 17:36:21.488376 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-655xx"] Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.742027 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.916227 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc\") pod \"188f326f-74f0-423d-9ae1-54aae0c1474e\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.916338 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config\") pod \"188f326f-74f0-423d-9ae1-54aae0c1474e\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.916437 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgldg\" (UniqueName: \"kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg\") pod \"188f326f-74f0-423d-9ae1-54aae0c1474e\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.943080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg" (OuterVolumeSpecName: "kube-api-access-jgldg") pod "188f326f-74f0-423d-9ae1-54aae0c1474e" (UID: "188f326f-74f0-423d-9ae1-54aae0c1474e"). InnerVolumeSpecName "kube-api-access-jgldg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.966140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config" (OuterVolumeSpecName: "config") pod "188f326f-74f0-423d-9ae1-54aae0c1474e" (UID: "188f326f-74f0-423d-9ae1-54aae0c1474e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.971805 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "188f326f-74f0-423d-9ae1-54aae0c1474e" (UID: "188f326f-74f0-423d-9ae1-54aae0c1474e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.018910 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.018954 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgldg\" (UniqueName: \"kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.018965 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.261159 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9eecdd-6791-4b24-8855-6036767861cf" path="/var/lib/kubelet/pods/5d9eecdd-6791-4b24-8855-6036767861cf/volumes" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.457465 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" event={"ID":"188f326f-74f0-423d-9ae1-54aae0c1474e","Type":"ContainerDied","Data":"eceba19972887264330468d5439623bcd97b3e95bfc235711a65beb5805973f3"} Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.457514 4795 scope.go:117] "RemoveContainer" 
containerID="87534ecfb167ccd1f65f7ca6c16ec74ef295ebd69a68982ce1f8e8d67186a8f5" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.457562 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.480008 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.487868 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:36:25 crc kubenswrapper[4795]: I0320 17:36:25.265752 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" path="/var/lib/kubelet/pods/188f326f-74f0-423d-9ae1-54aae0c1474e/volumes" Mar 20 17:36:26 crc kubenswrapper[4795]: I0320 17:36:26.523207 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: i/o timeout" Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.415783 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.416454 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:36:34 crc kubenswrapper[4795]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 17:36:34 crc kubenswrapper[4795]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 17:36:34 crc kubenswrapper[4795]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 17:36:34 crc kubenswrapper[4795]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 
17:36:34 crc kubenswrapper[4795]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:36:34 crc kubenswrapper[4795]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:36:34 crc kubenswrapper[4795]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:36:34 crc kubenswrapper[4795]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 17:36:34 crc kubenswrapper[4795]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4h5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePu
llPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(b8103489-e552-49b0-a32a-1069a46feff9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled
Mar 20 17:36:34 crc kubenswrapper[4795]: > logger="UnhandledError"
Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.417767 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="b8103489-e552-49b0-a32a-1069a46feff9"
Mar 20 17:36:34 crc kubenswrapper[4795]: I0320 17:36:34.537424 4795 scope.go:117] "RemoveContainer" containerID="bb90f0d50353c309b35e0e03c747d10c744c81a1b2cc09b5b5dec2672959655d"
Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.579190 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-server-0" podUID="b8103489-e552-49b0-a32a-1069a46feff9"
Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.630935 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p"
Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.631390 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 17:36:34 crc kubenswrapper[4795]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e
Mar 20 17:36:34 crc kubenswrapper[4795]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie
Mar 20 17:36:34 crc kubenswrapper[4795]: chmod 600 /var/lib/rabbitmq/.erlang.cookie
Mar 20 17:36:34 crc kubenswrapper[4795]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins
Mar 20 17:36:34 crc kubenswrapper[4795]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf
Mar 20 17:36:34 crc kubenswrapper[4795]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf
Mar 20 17:36:34 crc kubenswrapper[4795]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf
Mar 20 17:36:34 crc kubenswrapper[4795]: # Allow time for multi-pod clusters to complete peer discovery
Mar 20 17:36:34 crc kubenswrapper[4795]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j6r2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(d3e6834b-7e74-46f8-a734-b473080c05d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled
Mar 20 17:36:34 crc kubenswrapper[4795]: > logger="UnhandledError"
Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.633552 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3"
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.057266 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-n4gzx"]
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.135059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"]
Mar 20 17:36:35 crc kubenswrapper[4795]: W0320 17:36:35.145043 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a94015_8d98_4745_ab7a_74ebdd435638.slice/crio-ee4e28445819c720fb9a1aff178d5135783aa18871dab803797b32cd52667c96 WatchSource:0}: Error finding container ee4e28445819c720fb9a1aff178d5135783aa18871dab803797b32cd52667c96: Status 404 returned error can't find the container with id ee4e28445819c720fb9a1aff178d5135783aa18871dab803797b32cd52667c96
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.562363 4795 generic.go:334] "Generic (PLEG): container finished" podID="32a94015-8d98-4745-ab7a-74ebdd435638" containerID="c364cb833dc6a826503aaf9ccf6e20afbe333b743f6b2e28ddf8d915f44cb337" exitCode=0
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.562417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" event={"ID":"32a94015-8d98-4745-ab7a-74ebdd435638","Type":"ContainerDied","Data":"c364cb833dc6a826503aaf9ccf6e20afbe333b743f6b2e28ddf8d915f44cb337"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.562647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" event={"ID":"32a94015-8d98-4745-ab7a-74ebdd435638","Type":"ContainerStarted","Data":"ee4e28445819c720fb9a1aff178d5135783aa18871dab803797b32cd52667c96"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.565971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ca95ec62-fce9-4c91-bb59-fa80f512edba","Type":"ContainerStarted","Data":"df3795f1b066389261f49b06978897913a1ad9450900b8a1476362e278dd3477"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.566646 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.569047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n4gzx" event={"ID":"85004117-20bc-474e-88f5-ce49032749ff","Type":"ContainerStarted","Data":"137a2ea5670e7794f8668086fe831dd3103e9ff4f37fdb89e2c07bc0e196f6e1"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.570500 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987","Type":"ContainerStarted","Data":"be589416fed96b6271b6b5906e6f1d45a7a2e0530fadd8efe4c4d83d2ddd456b"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.572828 4795 generic.go:334] "Generic (PLEG): container finished" podID="38f88deb-b38d-4c52-a901-baeb9da08559" containerID="a9b37f38da5a02a709d41d5d8718cdfaffaae9f225a892b8a803fd7f9d1c5b9d" exitCode=0
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.572875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" event={"ID":"38f88deb-b38d-4c52-a901-baeb9da08559","Type":"ContainerDied","Data":"a9b37f38da5a02a709d41d5d8718cdfaffaae9f225a892b8a803fd7f9d1c5b9d"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.576571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsqcc" event={"ID":"5675bf5e-3a57-4082-8631-680ced6fb634","Type":"ContainerStarted","Data":"7b2cecc174f5dcc6f6034c4b95f2a94c23f585710a36c097c1d013a8018c8d16"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.580353 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f5a24ef-fc80-4386-9f81-5f21154223f3","Type":"ContainerStarted","Data":"20d641e4b2c5af5a648fe4d600e82a2440ccb249de16db64b4e4b4b818f40e8b"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.587623 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7","Type":"ContainerStarted","Data":"3c11b4ea967168fa5d174eecf8dab28998faefca6fdedab2fa357e8713bbe892"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.589610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g" event={"ID":"28df10bb-d6a9-47a9-9b79-0bb9665529ef","Type":"ContainerStarted","Data":"c670dee051c4c8511222332bcd79dd4320ea4d16c1fc4edf2416e15ce5d175dd"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.590779 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.599597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3b9d4ac2-2b66-441a-a6d4-0d467d857f99","Type":"ContainerStarted","Data":"5caf37cf4351516be5d864c1286eb4d8795b6278dae1f859a384a32c06688c9f"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.605507 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=25.139101628 podStartE2EDuration="35.605488271s" podCreationTimestamp="2026-03-20 17:36:00 +0000 UTC" firstStartedPulling="2026-03-20 17:36:11.019018332 +0000 UTC m=+1114.477049893" lastFinishedPulling="2026-03-20 17:36:21.485404975 +0000 UTC m=+1124.943436536" observedRunningTime="2026-03-20 17:36:35.602162918 +0000 UTC m=+1139.060194479" watchObservedRunningTime="2026-03-20 17:36:35.605488271 +0000 UTC m=+1139.063519812"
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.606921 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8","Type":"ContainerStarted","Data":"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802"}
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.607042 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 20 17:36:35 crc kubenswrapper[4795]: E0320 17:36:35.610967 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3"
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.720604 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.032010815 podStartE2EDuration="33.720571573s" podCreationTimestamp="2026-03-20 17:36:02 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.978667025 +0000 UTC m=+1114.436698576" lastFinishedPulling="2026-03-20 17:36:34.667227793 +0000 UTC m=+1138.125259334" observedRunningTime="2026-03-20 17:36:35.714070622 +0000 UTC m=+1139.172102173" watchObservedRunningTime="2026-03-20 17:36:35.720571573 +0000 UTC m=+1139.178603135"
Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.736334 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dnp2g" podStartSLOduration=16.861927349 podStartE2EDuration="29.736316894s" podCreationTimestamp="2026-03-20 17:36:06 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.998854464 +0000 UTC m=+1114.456886005" lastFinishedPulling="2026-03-20 17:36:23.873244009 +0000 UTC m=+1127.331275550" observedRunningTime="2026-03-20 17:36:35.730512053 +0000 UTC m=+1139.188543594" watchObservedRunningTime="2026-03-20 17:36:35.736316894 +0000 UTC m=+1139.194348435"
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.618396 4795 generic.go:334] "Generic (PLEG): container finished" podID="5675bf5e-3a57-4082-8631-680ced6fb634" containerID="7b2cecc174f5dcc6f6034c4b95f2a94c23f585710a36c097c1d013a8018c8d16" exitCode=0
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.619486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsqcc" event={"ID":"5675bf5e-3a57-4082-8631-680ced6fb634","Type":"ContainerDied","Data":"7b2cecc174f5dcc6f6034c4b95f2a94c23f585710a36c097c1d013a8018c8d16"}
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.619733 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.619799 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsqcc" event={"ID":"5675bf5e-3a57-4082-8631-680ced6fb634","Type":"ContainerStarted","Data":"f62525ebd7dd589cb5ed4eef766150f5dadbae6274446cd1ee837efe0dd7dc83"}
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.619818 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsqcc" event={"ID":"5675bf5e-3a57-4082-8631-680ced6fb634","Type":"ContainerStarted","Data":"ccb21af206808d8161b15ed779b3acf783c07dfbafc6494225d5a626c7d3ff23"}
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.619830 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.624339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" event={"ID":"32a94015-8d98-4745-ab7a-74ebdd435638","Type":"ContainerStarted","Data":"e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68"}
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.641189 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dsqcc" podStartSLOduration=19.879408314 podStartE2EDuration="30.641174022s" podCreationTimestamp="2026-03-20 17:36:06 +0000 UTC" firstStartedPulling="2026-03-20 17:36:11.93799878 +0000 UTC m=+1115.396030321" lastFinishedPulling="2026-03-20 17:36:22.699764488 +0000 UTC m=+1126.157796029" observedRunningTime="2026-03-20 17:36:36.636866618 +0000 UTC m=+1140.094898159" watchObservedRunningTime="2026-03-20 17:36:36.641174022 +0000 UTC m=+1140.099205563"
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.668925 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" podStartSLOduration=17.668882626 podStartE2EDuration="17.668882626s" podCreationTimestamp="2026-03-20 17:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:36.660006699 +0000 UTC m=+1140.118038240" watchObservedRunningTime="2026-03-20 17:36:36.668882626 +0000 UTC m=+1140.126914167"
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.904221 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-j4mtv"
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.985704 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28trk\" (UniqueName: \"kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk\") pod \"38f88deb-b38d-4c52-a901-baeb9da08559\" (UID: \"38f88deb-b38d-4c52-a901-baeb9da08559\") "
Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.995963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk" (OuterVolumeSpecName: "kube-api-access-28trk") pod "38f88deb-b38d-4c52-a901-baeb9da08559" (UID: "38f88deb-b38d-4c52-a901-baeb9da08559"). InnerVolumeSpecName "kube-api-access-28trk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.087804 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28trk\" (UniqueName: \"kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk\") on node \"crc\" DevicePath \"\""
Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.634096 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-j4mtv"
Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.634276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" event={"ID":"38f88deb-b38d-4c52-a901-baeb9da08559","Type":"ContainerDied","Data":"d53c776ab6a465f3a075842cf379886815e4d1caf766e21f6fcc37ed80564b8a"}
Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.634788 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d53c776ab6a465f3a075842cf379886815e4d1caf766e21f6fcc37ed80564b8a"
Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.635424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw"
Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.966273 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-kh5md"]
Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.973494 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-kh5md"]
Mar 20 17:36:38 crc kubenswrapper[4795]: I0320 17:36:38.644777 4795 generic.go:334] "Generic (PLEG): container finished" podID="0f5a24ef-fc80-4386-9f81-5f21154223f3" containerID="20d641e4b2c5af5a648fe4d600e82a2440ccb249de16db64b4e4b4b818f40e8b" exitCode=0
Mar 20 17:36:38 crc kubenswrapper[4795]: I0320 17:36:38.645821 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f5a24ef-fc80-4386-9f81-5f21154223f3","Type":"ContainerDied","Data":"20d641e4b2c5af5a648fe4d600e82a2440ccb249de16db64b4e4b4b818f40e8b"}
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.263051 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93986a1-82a8-4eac-ba5e-f790196b25ce" path="/var/lib/kubelet/pods/f93986a1-82a8-4eac-ba5e-f790196b25ce/volumes"
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.656120 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987" containerID="be589416fed96b6271b6b5906e6f1d45a7a2e0530fadd8efe4c4d83d2ddd456b" exitCode=0
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.656251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987","Type":"ContainerDied","Data":"be589416fed96b6271b6b5906e6f1d45a7a2e0530fadd8efe4c4d83d2ddd456b"}
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.660164 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7","Type":"ContainerStarted","Data":"42c51c488fe4bfac9ef63685016e66545b68d399c83290387b1757181c2dd716"}
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.664359 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3b9d4ac2-2b66-441a-a6d4-0d467d857f99","Type":"ContainerStarted","Data":"a7b67360e8b08288f00db3af0c6989e32f69622da33969cfda3444f96aee94dc"}
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.667394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n4gzx" event={"ID":"85004117-20bc-474e-88f5-ce49032749ff","Type":"ContainerStarted","Data":"aa860c8d92a2db46347d66290b667c4d2b2232b38a2fb358259720ec2599f882"}
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.672528 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f5a24ef-fc80-4386-9f81-5f21154223f3","Type":"ContainerStarted","Data":"44d126d474f5c38d05ae811af673fb00ab0fc770694ad6081494c7edc888a203"}
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.730022 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=31.547693068 podStartE2EDuration="42.730000169s" podCreationTimestamp="2026-03-20 17:35:57 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.802632585 +0000 UTC m=+1114.260664126" lastFinishedPulling="2026-03-20 17:36:21.984939686 +0000 UTC m=+1125.442971227" observedRunningTime="2026-03-20 17:36:39.721799624 +0000 UTC m=+1143.179831205" watchObservedRunningTime="2026-03-20 17:36:39.730000169 +0000 UTC m=+1143.188031730"
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.764306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.12783376 podStartE2EDuration="30.764286467s" podCreationTimestamp="2026-03-20 17:36:09 +0000 UTC" firstStartedPulling="2026-03-20 17:36:11.203473264 +0000 UTC m=+1114.661504815" lastFinishedPulling="2026-03-20 17:36:38.839925981 +0000 UTC m=+1142.297957522" observedRunningTime="2026-03-20 17:36:39.750931171 +0000 UTC m=+1143.208962722" watchObservedRunningTime="2026-03-20 17:36:39.764286467 +0000 UTC m=+1143.222318018"
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.810032 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.114965009 podStartE2EDuration="31.81001274s" podCreationTimestamp="2026-03-20 17:36:08 +0000 UTC" firstStartedPulling="2026-03-20 17:36:11.104805242 +0000 UTC m=+1114.562836783" lastFinishedPulling="2026-03-20 17:36:38.799852973 +0000 UTC m=+1142.257884514" observedRunningTime="2026-03-20 17:36:39.779233842 +0000 UTC m=+1143.237265383" watchObservedRunningTime="2026-03-20 17:36:39.81001274 +0000 UTC m=+1143.268044291"
Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.839294 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-n4gzx" podStartSLOduration=17.05935214 podStartE2EDuration="20.839274471s" podCreationTimestamp="2026-03-20 17:36:19 +0000 UTC" firstStartedPulling="2026-03-20 17:36:35.087589418 +0000 UTC m=+1138.545620959" lastFinishedPulling="2026-03-20 17:36:38.867511749 +0000 UTC m=+1142.325543290" observedRunningTime="2026-03-20 17:36:39.833925824 +0000 UTC m=+1143.291957365" watchObservedRunningTime="2026-03-20 17:36:39.839274471 +0000 UTC m=+1143.297306012"
Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.199603 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.199647 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.253865 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.370142 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.370557 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.419450 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.683300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987","Type":"ContainerStarted","Data":"e9df1e53f87c68e30f263f17ef84197934d01922d65cbffbe45f43cf8a087e5d"}
Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.715909 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.661704817 podStartE2EDuration="41.71588792s" podCreationTimestamp="2026-03-20 17:35:59 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.812322727 +0000 UTC m=+1114.270354268" lastFinishedPulling="2026-03-20 17:36:23.86650583 +0000 UTC m=+1127.324537371" observedRunningTime="2026-03-20 17:36:40.711198055 +0000 UTC m=+1144.169229616" watchObservedRunningTime="2026-03-20 17:36:40.71588792 +0000 UTC m=+1144.173919461"
Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.733970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.748465 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.020968 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 17:36:41 crc kubenswrapper[4795]: E0320 17:36:41.021299 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f88deb-b38d-4c52-a901-baeb9da08559" containerName="oc"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.021318 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f88deb-b38d-4c52-a901-baeb9da08559" containerName="oc"
Mar 20 17:36:41 crc kubenswrapper[4795]: E0320 17:36:41.021353 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="init"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.021362 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="init"
Mar 20 17:36:41 crc kubenswrapper[4795]: E0320 17:36:41.021393 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="dnsmasq-dns"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.021402 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="dnsmasq-dns"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.021571 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="dnsmasq-dns"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.021587 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f88deb-b38d-4c52-a901-baeb9da08559" containerName="oc"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.022512 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.024776 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.024926 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7bwm5"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.026388 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.027658 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.043211 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.078461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.082542 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-config\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.082720 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.082836 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brh4g\" (UniqueName: \"kubernetes.io/projected/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-kube-api-access-brh4g\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.082978 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-scripts\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.083078 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.083156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.101927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185163 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brh4g\" (UniqueName: \"kubernetes.io/projected/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-kube-api-access-brh4g\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-scripts\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185340 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185435 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-config\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185500 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.186350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-scripts\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.186485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-config\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.190150 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.213424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.214608 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.222468 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brh4g\" (UniqueName: \"kubernetes.io/projected/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-kube-api-access-brh4g\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.358364 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.825520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 17:36:41 crc kubenswrapper[4795]: W0320 17:36:41.830710 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cfc9397_7268_4bd1_8bbf_d107e94ab35a.slice/crio-1518a3cecdc9e3a56b40f7254b126c619f179fe07f9eb33c86baa277e9bef838 WatchSource:0}: Error finding container 1518a3cecdc9e3a56b40f7254b126c619f179fe07f9eb33c86baa277e9bef838: Status 404 returned error can't find the container with id 1518a3cecdc9e3a56b40f7254b126c619f179fe07f9eb33c86baa277e9bef838
Mar 20 17:36:42 crc kubenswrapper[4795]: I0320 17:36:42.696862 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6cfc9397-7268-4bd1-8bbf-d107e94ab35a","Type":"ContainerStarted","Data":"1518a3cecdc9e3a56b40f7254b126c619f179fe07f9eb33c86baa277e9bef838"}
Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.205909 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"]
Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.206847 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="dnsmasq-dns" containerID="cri-o://e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68" gracePeriod=10
Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.207023 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.208145 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw"
Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.236315 4795
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"] Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.243133 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.277473 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"] Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.323018 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.323088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.323110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmkt4\" (UniqueName: \"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.323176 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: 
\"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.323236 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: E0320 17:36:43.403181 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a94015_8d98_4745_ab7a_74ebdd435638.slice/crio-e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a94015_8d98_4745_ab7a_74ebdd435638.slice/crio-conmon-e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.424618 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425046 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425078 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkt4\" (UniqueName: \"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.426142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.426876 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.446823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmkt4\" (UniqueName: \"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.610702 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.708659 4795 generic.go:334] "Generic (PLEG): container finished" podID="32a94015-8d98-4745-ab7a-74ebdd435638" containerID="e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68" exitCode=0 Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.708715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" event={"ID":"32a94015-8d98-4745-ab7a-74ebdd435638","Type":"ContainerDied","Data":"e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68"} Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.901334 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.034649 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config\") pod \"32a94015-8d98-4745-ab7a-74ebdd435638\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.034783 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm699\" (UniqueName: \"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699\") pod \"32a94015-8d98-4745-ab7a-74ebdd435638\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.034863 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb\") pod \"32a94015-8d98-4745-ab7a-74ebdd435638\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.034885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc\") pod \"32a94015-8d98-4745-ab7a-74ebdd435638\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.034929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb\") pod \"32a94015-8d98-4745-ab7a-74ebdd435638\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.042024 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699" (OuterVolumeSpecName: "kube-api-access-gm699") pod "32a94015-8d98-4745-ab7a-74ebdd435638" (UID: "32a94015-8d98-4745-ab7a-74ebdd435638"). InnerVolumeSpecName "kube-api-access-gm699". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.100149 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config" (OuterVolumeSpecName: "config") pod "32a94015-8d98-4745-ab7a-74ebdd435638" (UID: "32a94015-8d98-4745-ab7a-74ebdd435638"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.102420 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32a94015-8d98-4745-ab7a-74ebdd435638" (UID: "32a94015-8d98-4745-ab7a-74ebdd435638"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.102545 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32a94015-8d98-4745-ab7a-74ebdd435638" (UID: "32a94015-8d98-4745-ab7a-74ebdd435638"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.103981 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32a94015-8d98-4745-ab7a-74ebdd435638" (UID: "32a94015-8d98-4745-ab7a-74ebdd435638"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.136221 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm699\" (UniqueName: \"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.136249 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.136257 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.136267 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.136277 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.264260 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"] Mar 20 17:36:44 crc kubenswrapper[4795]: W0320 17:36:44.264546 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod601af69d_c03f_4bdf_b3bf_67ba791674f9.slice/crio-f49aeeb3f2eaf36f87117fd9bb4fc971651fbd9ce295830a95f52f7faa0753d6 WatchSource:0}: Error finding container f49aeeb3f2eaf36f87117fd9bb4fc971651fbd9ce295830a95f52f7faa0753d6: Status 404 returned error can't find 
the container with id f49aeeb3f2eaf36f87117fd9bb4fc971651fbd9ce295830a95f52f7faa0753d6 Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.365479 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:36:44 crc kubenswrapper[4795]: E0320 17:36:44.365844 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="dnsmasq-dns" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.365859 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="dnsmasq-dns" Mar 20 17:36:44 crc kubenswrapper[4795]: E0320 17:36:44.365876 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="init" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.365883 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="init" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.366035 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="dnsmasq-dns" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.394581 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.397259 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dvvhq" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.397626 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.397672 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.398565 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.415062 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440362 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4vs\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-kube-api-access-xx4vs\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e955e5-ba7a-4582-9d52-40333fe21b7f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " 
pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-cache\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440627 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-lock\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440650 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.542494 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-cache\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.542889 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-lock\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.542917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.542944 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4vs\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-kube-api-access-xx4vs\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.542964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e955e5-ba7a-4582-9d52-40333fe21b7f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.543020 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.543073 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-cache\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: E0320 17:36:44.543249 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:36:44 crc kubenswrapper[4795]: E0320 17:36:44.543268 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:36:44 
crc kubenswrapper[4795]: E0320 17:36:44.543322 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:36:45.043303789 +0000 UTC m=+1148.501335330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.543334 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.543758 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-lock\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.547578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e955e5-ba7a-4582-9d52-40333fe21b7f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.561725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4vs\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-kube-api-access-xx4vs\") pod \"swift-storage-0\" (UID: 
\"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.568395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.717940 4795 generic.go:334] "Generic (PLEG): container finished" podID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerID="a43865e4251904d08d5c0655d2fd65e83c2843454a9cfbf7734d7aa91dad11f3" exitCode=0 Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.718006 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8grln" event={"ID":"601af69d-c03f-4bdf-b3bf-67ba791674f9","Type":"ContainerDied","Data":"a43865e4251904d08d5c0655d2fd65e83c2843454a9cfbf7734d7aa91dad11f3"} Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.718100 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8grln" event={"ID":"601af69d-c03f-4bdf-b3bf-67ba791674f9","Type":"ContainerStarted","Data":"f49aeeb3f2eaf36f87117fd9bb4fc971651fbd9ce295830a95f52f7faa0753d6"} Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.721344 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6cfc9397-7268-4bd1-8bbf-d107e94ab35a","Type":"ContainerStarted","Data":"3885d095895d5da5a1b34a80367bb40ec5b285efc2098d1ba8e7cbc11155364b"} Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.721387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6cfc9397-7268-4bd1-8bbf-d107e94ab35a","Type":"ContainerStarted","Data":"246986c0d6bbfde2470a9586085454fa6243ebcee0f179784cf0c7047eaf3732"} Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.721526 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.723835 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" event={"ID":"32a94015-8d98-4745-ab7a-74ebdd435638","Type":"ContainerDied","Data":"ee4e28445819c720fb9a1aff178d5135783aa18871dab803797b32cd52667c96"} Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.723875 4795 scope.go:117] "RemoveContainer" containerID="e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.724049 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.764603 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.704862109 podStartE2EDuration="4.764585498s" podCreationTimestamp="2026-03-20 17:36:40 +0000 UTC" firstStartedPulling="2026-03-20 17:36:41.833131411 +0000 UTC m=+1145.291162952" lastFinishedPulling="2026-03-20 17:36:43.89285479 +0000 UTC m=+1147.350886341" observedRunningTime="2026-03-20 17:36:44.762425321 +0000 UTC m=+1148.220456872" watchObservedRunningTime="2026-03-20 17:36:44.764585498 +0000 UTC m=+1148.222617049" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.835723 4795 scope.go:117] "RemoveContainer" containerID="c364cb833dc6a826503aaf9ccf6e20afbe333b743f6b2e28ddf8d915f44cb337" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.861817 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"] Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.868985 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"] Mar 20 17:36:45 crc kubenswrapper[4795]: I0320 17:36:45.051025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:45 crc kubenswrapper[4795]: E0320 17:36:45.051248 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:36:45 crc kubenswrapper[4795]: E0320 17:36:45.051267 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:36:45 crc kubenswrapper[4795]: E0320 17:36:45.051325 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:36:46.051305893 +0000 UTC m=+1149.509337434 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found Mar 20 17:36:45 crc kubenswrapper[4795]: I0320 17:36:45.261363 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" path="/var/lib/kubelet/pods/32a94015-8d98-4745-ab7a-74ebdd435638/volumes" Mar 20 17:36:45 crc kubenswrapper[4795]: I0320 17:36:45.736793 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8grln" event={"ID":"601af69d-c03f-4bdf-b3bf-67ba791674f9","Type":"ContainerStarted","Data":"7043047e88a0378017830c2f9e0915780f4eef0732a290fd3459875c42d1f7cc"} Mar 20 17:36:45 crc kubenswrapper[4795]: I0320 17:36:45.736956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:45 crc 
kubenswrapper[4795]: I0320 17:36:45.768214 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-8grln" podStartSLOduration=2.76819001 podStartE2EDuration="2.76819001s" podCreationTimestamp="2026-03-20 17:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:45.756001191 +0000 UTC m=+1149.214032782" watchObservedRunningTime="2026-03-20 17:36:45.76819001 +0000 UTC m=+1149.226221591" Mar 20 17:36:46 crc kubenswrapper[4795]: I0320 17:36:46.064978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:46 crc kubenswrapper[4795]: E0320 17:36:46.065239 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:36:46 crc kubenswrapper[4795]: E0320 17:36:46.065277 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:36:46 crc kubenswrapper[4795]: E0320 17:36:46.065352 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:36:48.06533119 +0000 UTC m=+1151.523362741 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.102809 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:48 crc kubenswrapper[4795]: E0320 17:36:48.102974 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:36:48 crc kubenswrapper[4795]: E0320 17:36:48.103754 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:36:48 crc kubenswrapper[4795]: E0320 17:36:48.103832 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:36:52.103807389 +0000 UTC m=+1155.561838960 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.283473 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-m8zw5"] Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.285539 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.288366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.289023 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.290620 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306106 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m8zw5"] Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306606 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 
17:36:48.306695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxsvz\" (UniqueName: \"kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.407638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 
17:36:48.408011 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408057 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408113 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxsvz\" (UniqueName: \"kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408653 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.409134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.409139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.414257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.414373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle\") pod 
\"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.414437 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.431286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxsvz\" (UniqueName: \"kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.613597 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.192663 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m8zw5"] Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.243786 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.244340 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.336159 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.659513 4795 scope.go:117] "RemoveContainer" containerID="56b4e175842a208b79b6d416a354b0c057585a391bd973a1b6ce26b23a0cd738" Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.771954 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerStarted","Data":"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157"} Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.773713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m8zw5" event={"ID":"2c422574-0103-4c97-9e23-5a78c5b44e69","Type":"ContainerStarted","Data":"97fb6267b98d2b148faab38e0f46037ac0fdd70749948e0fa5391492bff624c1"} Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.880657 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 17:36:50 crc kubenswrapper[4795]: I0320 17:36:50.631983 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:50 crc kubenswrapper[4795]: I0320 17:36:50.632042 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:50 crc kubenswrapper[4795]: I0320 17:36:50.716130 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:50 crc kubenswrapper[4795]: I0320 17:36:50.789221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerStarted","Data":"5473602d5499b1067c63d6b98d02f2810f56405e993453774e2f6c5d19c36aea"} Mar 20 17:36:50 crc kubenswrapper[4795]: I0320 17:36:50.897289 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.306194 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-30ae-account-create-update-d79gp"] Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.307526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.309563 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.324858 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-30ae-account-create-update-d79gp"] Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.343892 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8d96q"] Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.348626 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.356939 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8d96q"] Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.462383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnn82\" (UniqueName: \"kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.462496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.462550 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.462621 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg59n\" (UniqueName: \"kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.564057 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.564114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.564172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg59n\" (UniqueName: \"kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.564276 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnn82\" (UniqueName: \"kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.565440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 
17:36:51.566105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.585665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg59n\" (UniqueName: \"kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.585671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnn82\" (UniqueName: \"kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.627424 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.672432 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8d96q" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.012818 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ncfp9"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.014029 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.027603 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ncfp9"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.081006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.081249 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbd58\" (UniqueName: \"kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.145381 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1681-account-create-update-vpwb2"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.146576 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.148359 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.158087 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1681-account-create-update-vpwb2"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.183296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.183364 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbd58\" (UniqueName: \"kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: E0320 17:36:52.183407 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:36:52 crc kubenswrapper[4795]: E0320 17:36:52.183423 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.183438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: E0320 
17:36:52.183463 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:37:00.183447501 +0000 UTC m=+1163.641479042 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.184045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.202052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbd58\" (UniqueName: \"kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.244058 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-c5rg6"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.245097 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.260400 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-c5rg6"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.285384 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.285522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hppz4\" (UniqueName: \"kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.348992 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.353186 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a409-account-create-update-zvscf"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.366058 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a409-account-create-update-zvscf"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.366168 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.369083 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.390608 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hppz4\" (UniqueName: \"kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.390674 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78bqb\" (UniqueName: \"kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.390762 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.390786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.393419 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.406433 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hppz4\" (UniqueName: \"kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.462768 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.492112 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88z4m\" (UniqueName: \"kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.492179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.492226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.492278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78bqb\" (UniqueName: \"kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.493370 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.513594 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78bqb\" (UniqueName: \"kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.564013 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.594155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.594258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88z4m\" (UniqueName: \"kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.595136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.610486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88z4m\" (UniqueName: \"kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.698574 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.614928 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.618213 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ncfp9"] Mar 20 17:36:53 crc kubenswrapper[4795]: W0320 17:36:53.620102 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aac28d5_6b58_424e_83f8_ec71c53e41ce.slice/crio-690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c WatchSource:0}: Error finding container 690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c: Status 404 returned error can't find the container with id 690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.634010 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-30ae-account-create-update-d79gp"] Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.639667 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a409-account-create-update-zvscf"] Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.679094 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.679285 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="dnsmasq-dns" containerID="cri-o://cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe" gracePeriod=10 Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.772295 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-1681-account-create-update-vpwb2"] Mar 20 17:36:53 crc kubenswrapper[4795]: W0320 17:36:53.782941 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9265b8c_0b80_47d9_8f4b_3d996233341e.slice/crio-e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876 WatchSource:0}: Error finding container e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876: Status 404 returned error can't find the container with id e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876 Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.791818 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-c5rg6"] Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.802388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8d96q"] Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.817645 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8d96q" event={"ID":"389c1f10-5aba-4c4d-b0b3-3a38f6038536","Type":"ContainerStarted","Data":"10d072b8e1df887899cd7e286a697237d05c31ae70d12b646ba13bce102cacf2"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.824643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a409-account-create-update-zvscf" event={"ID":"acfb1ea8-a8d2-4152-ad18-54d380b289c4","Type":"ContainerStarted","Data":"74a7613978faa2356589d41700c5d796593e713d0f9d37d06f172fe2c54e0f4d"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.825901 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m8zw5" event={"ID":"2c422574-0103-4c97-9e23-5a78c5b44e69","Type":"ContainerStarted","Data":"5d7bc73635aabb988bf8536345a728b4481d0d64844c13a801df663371346492"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.828582 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-30ae-account-create-update-d79gp" event={"ID":"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87","Type":"ContainerStarted","Data":"9819bc13c4d025d57bc13c848c09e31cf44875556c9e7737142ef481557663b2"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.836202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1681-account-create-update-vpwb2" event={"ID":"c9265b8c-0b80-47d9-8f4b-3d996233341e","Type":"ContainerStarted","Data":"e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.837585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ncfp9" event={"ID":"6aac28d5-6b58-424e-83f8-ec71c53e41ce","Type":"ContainerStarted","Data":"690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.865648 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-m8zw5" podStartSLOduration=1.883358547 podStartE2EDuration="5.865622577s" podCreationTimestamp="2026-03-20 17:36:48 +0000 UTC" firstStartedPulling="2026-03-20 17:36:49.195614708 +0000 UTC m=+1152.653646239" lastFinishedPulling="2026-03-20 17:36:53.177878728 +0000 UTC m=+1156.635910269" observedRunningTime="2026-03-20 17:36:53.845862442 +0000 UTC m=+1157.303893983" watchObservedRunningTime="2026-03-20 17:36:53.865622577 +0000 UTC m=+1157.323654118" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.169317 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.259390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc\") pod \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.259473 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config\") pod \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.259648 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccwzs\" (UniqueName: \"kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs\") pod \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.275636 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs" (OuterVolumeSpecName: "kube-api-access-ccwzs") pod "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" (UID: "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2"). InnerVolumeSpecName "kube-api-access-ccwzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.302096 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config" (OuterVolumeSpecName: "config") pod "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" (UID: "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.309289 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" (UID: "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.361573 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccwzs\" (UniqueName: \"kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.361616 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.361629 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.845278 4795 generic.go:334] "Generic (PLEG): container finished" podID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerID="cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.845349 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.845365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" event={"ID":"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2","Type":"ContainerDied","Data":"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.845394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" event={"ID":"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2","Type":"ContainerDied","Data":"a6075430a5a7a42dcee3b92020556eee6821261a85421f6ff8cc34985b56804c"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.845410 4795 scope.go:117] "RemoveContainer" containerID="cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.846920 4795 generic.go:334] "Generic (PLEG): container finished" podID="acfb1ea8-a8d2-4152-ad18-54d380b289c4" containerID="1fe0f6a7ba267ec0588c7d4179b78569ace69ae42af4f4ce02a9e28bfc87aa93" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.846996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a409-account-create-update-zvscf" event={"ID":"acfb1ea8-a8d2-4152-ad18-54d380b289c4","Type":"ContainerDied","Data":"1fe0f6a7ba267ec0588c7d4179b78569ace69ae42af4f4ce02a9e28bfc87aa93"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.848210 4795 generic.go:334] "Generic (PLEG): container finished" podID="f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" containerID="115e8aa5e635a588311da0792150e7730feaab865eb0acb01117eb70b42bfde3" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.848281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c5rg6" 
event={"ID":"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc","Type":"ContainerDied","Data":"115e8aa5e635a588311da0792150e7730feaab865eb0acb01117eb70b42bfde3"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.848300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c5rg6" event={"ID":"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc","Type":"ContainerStarted","Data":"a52e3060dd33f381cd140ac8936ebc19848c276f6cba28e6c942f6eae0bfa041"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.850348 4795 generic.go:334] "Generic (PLEG): container finished" podID="1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" containerID="fa7f816c765d44ed743198c38348dd663b04f7cfc3b7f6aac5dffa2623d4db45" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.850396 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-30ae-account-create-update-d79gp" event={"ID":"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87","Type":"ContainerDied","Data":"fa7f816c765d44ed743198c38348dd663b04f7cfc3b7f6aac5dffa2623d4db45"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.852179 4795 generic.go:334] "Generic (PLEG): container finished" podID="c9265b8c-0b80-47d9-8f4b-3d996233341e" containerID="c288b4ff895d130555ade7ce513d591493310d7fb3678ee47968d204fe11297a" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.852245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1681-account-create-update-vpwb2" event={"ID":"c9265b8c-0b80-47d9-8f4b-3d996233341e","Type":"ContainerDied","Data":"c288b4ff895d130555ade7ce513d591493310d7fb3678ee47968d204fe11297a"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.853958 4795 generic.go:334] "Generic (PLEG): container finished" podID="6aac28d5-6b58-424e-83f8-ec71c53e41ce" containerID="2d63356a6d0232331bb76203b1359e46e0f2a21a5ebc5f3160865388f8cf9a1c" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.853996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-ncfp9" event={"ID":"6aac28d5-6b58-424e-83f8-ec71c53e41ce","Type":"ContainerDied","Data":"2d63356a6d0232331bb76203b1359e46e0f2a21a5ebc5f3160865388f8cf9a1c"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.855332 4795 generic.go:334] "Generic (PLEG): container finished" podID="389c1f10-5aba-4c4d-b0b3-3a38f6038536" containerID="61c0a7747547de21c917366527d52306c78b302a64918960d0e832416be0ca0f" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.855426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8d96q" event={"ID":"389c1f10-5aba-4c4d-b0b3-3a38f6038536","Type":"ContainerDied","Data":"61c0a7747547de21c917366527d52306c78b302a64918960d0e832416be0ca0f"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.871489 4795 scope.go:117] "RemoveContainer" containerID="dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.914254 4795 scope.go:117] "RemoveContainer" containerID="cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe" Mar 20 17:36:54 crc kubenswrapper[4795]: E0320 17:36:54.914780 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe\": container with ID starting with cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe not found: ID does not exist" containerID="cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.914836 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe"} err="failed to get container status \"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe\": rpc error: code = NotFound desc = could not find container 
\"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe\": container with ID starting with cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe not found: ID does not exist" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.914870 4795 scope.go:117] "RemoveContainer" containerID="dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589" Mar 20 17:36:54 crc kubenswrapper[4795]: E0320 17:36:54.915264 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589\": container with ID starting with dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589 not found: ID does not exist" containerID="dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.915309 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589"} err="failed to get container status \"dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589\": rpc error: code = NotFound desc = could not find container \"dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589\": container with ID starting with dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589 not found: ID does not exist" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.987226 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.992321 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:36:55 crc kubenswrapper[4795]: I0320 17:36:55.265397 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" 
path="/var/lib/kubelet/pods/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2/volumes" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.174101 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.298724 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.304162 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts\") pod \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.304278 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnn82\" (UniqueName: \"kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82\") pod \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.305077 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" (UID: "1c13d27e-7afd-4113-9b7d-fcbf7eb87c87"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.316884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82" (OuterVolumeSpecName: "kube-api-access-rnn82") pod "1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" (UID: "1c13d27e-7afd-4113-9b7d-fcbf7eb87c87"). InnerVolumeSpecName "kube-api-access-rnn82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.409177 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hppz4\" (UniqueName: \"kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4\") pod \"c9265b8c-0b80-47d9-8f4b-3d996233341e\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.409306 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts\") pod \"c9265b8c-0b80-47d9-8f4b-3d996233341e\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.409776 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.409793 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnn82\" (UniqueName: \"kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.413567 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9265b8c-0b80-47d9-8f4b-3d996233341e" (UID: "c9265b8c-0b80-47d9-8f4b-3d996233341e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.414230 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4" (OuterVolumeSpecName: "kube-api-access-hppz4") pod "c9265b8c-0b80-47d9-8f4b-3d996233341e" (UID: "c9265b8c-0b80-47d9-8f4b-3d996233341e"). InnerVolumeSpecName "kube-api-access-hppz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.457574 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.493818 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.495758 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.510239 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8d96q" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.511700 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hppz4\" (UniqueName: \"kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.511723 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613258 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88z4m\" (UniqueName: \"kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m\") pod \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613368 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78bqb\" (UniqueName: \"kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb\") pod \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613434 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts\") pod \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts\") pod 
\"acfb1ea8-a8d2-4152-ad18-54d380b289c4\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613490 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts\") pod \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613519 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts\") pod \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613554 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbd58\" (UniqueName: \"kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58\") pod \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613584 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg59n\" (UniqueName: \"kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n\") pod \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.615851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "acfb1ea8-a8d2-4152-ad18-54d380b289c4" (UID: "acfb1ea8-a8d2-4152-ad18-54d380b289c4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.617409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6aac28d5-6b58-424e-83f8-ec71c53e41ce" (UID: "6aac28d5-6b58-424e-83f8-ec71c53e41ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.617701 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "389c1f10-5aba-4c4d-b0b3-3a38f6038536" (UID: "389c1f10-5aba-4c4d-b0b3-3a38f6038536"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.617854 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" (UID: "f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.618864 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n" (OuterVolumeSpecName: "kube-api-access-wg59n") pod "389c1f10-5aba-4c4d-b0b3-3a38f6038536" (UID: "389c1f10-5aba-4c4d-b0b3-3a38f6038536"). InnerVolumeSpecName "kube-api-access-wg59n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.619356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58" (OuterVolumeSpecName: "kube-api-access-rbd58") pod "6aac28d5-6b58-424e-83f8-ec71c53e41ce" (UID: "6aac28d5-6b58-424e-83f8-ec71c53e41ce"). InnerVolumeSpecName "kube-api-access-rbd58". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.619441 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m" (OuterVolumeSpecName: "kube-api-access-88z4m") pod "acfb1ea8-a8d2-4152-ad18-54d380b289c4" (UID: "acfb1ea8-a8d2-4152-ad18-54d380b289c4"). InnerVolumeSpecName "kube-api-access-88z4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.620544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb" (OuterVolumeSpecName: "kube-api-access-78bqb") pod "f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" (UID: "f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc"). InnerVolumeSpecName "kube-api-access-78bqb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715764 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78bqb\" (UniqueName: \"kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb\") on node \"crc\" DevicePath \"\""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715805 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715819 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715833 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715845 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715856 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbd58\" (UniqueName: \"kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58\") on node \"crc\" DevicePath \"\""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715868 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg59n\" (UniqueName: \"kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n\") on node \"crc\" DevicePath \"\""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715881 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88z4m\" (UniqueName: \"kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m\") on node \"crc\" DevicePath \"\""
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.886609 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a409-account-create-update-zvscf" event={"ID":"acfb1ea8-a8d2-4152-ad18-54d380b289c4","Type":"ContainerDied","Data":"74a7613978faa2356589d41700c5d796593e713d0f9d37d06f172fe2c54e0f4d"}
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.886660 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74a7613978faa2356589d41700c5d796593e713d0f9d37d06f172fe2c54e0f4d"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.886748 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a409-account-create-update-zvscf"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.892296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c5rg6" event={"ID":"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc","Type":"ContainerDied","Data":"a52e3060dd33f381cd140ac8936ebc19848c276f6cba28e6c942f6eae0bfa041"}
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.892389 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a52e3060dd33f381cd140ac8936ebc19848c276f6cba28e6c942f6eae0bfa041"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.892491 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-c5rg6"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.895145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-30ae-account-create-update-d79gp" event={"ID":"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87","Type":"ContainerDied","Data":"9819bc13c4d025d57bc13c848c09e31cf44875556c9e7737142ef481557663b2"}
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.895180 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9819bc13c4d025d57bc13c848c09e31cf44875556c9e7737142ef481557663b2"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.895240 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-30ae-account-create-update-d79gp"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.897654 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1681-account-create-update-vpwb2"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.897640 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1681-account-create-update-vpwb2" event={"ID":"c9265b8c-0b80-47d9-8f4b-3d996233341e","Type":"ContainerDied","Data":"e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876"}
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.898034 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.902957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ncfp9" event={"ID":"6aac28d5-6b58-424e-83f8-ec71c53e41ce","Type":"ContainerDied","Data":"690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c"}
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.903006 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.903102 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ncfp9"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.910709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8d96q" event={"ID":"389c1f10-5aba-4c4d-b0b3-3a38f6038536","Type":"ContainerDied","Data":"10d072b8e1df887899cd7e286a697237d05c31ae70d12b646ba13bce102cacf2"}
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.910740 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10d072b8e1df887899cd7e286a697237d05c31ae70d12b646ba13bce102cacf2"
Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.910824 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8d96q"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.870534 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-c74j8"]
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871345 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871361 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871378 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871386 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871404 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="init"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871414 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="init"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871427 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfb1ea8-a8d2-4152-ad18-54d380b289c4" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871436 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfb1ea8-a8d2-4152-ad18-54d380b289c4" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871451 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389c1f10-5aba-4c4d-b0b3-3a38f6038536" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871480 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="389c1f10-5aba-4c4d-b0b3-3a38f6038536" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871491 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9265b8c-0b80-47d9-8f4b-3d996233341e" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871499 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9265b8c-0b80-47d9-8f4b-3d996233341e" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871514 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="dnsmasq-dns"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871522 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="dnsmasq-dns"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871566 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aac28d5-6b58-424e-83f8-ec71c53e41ce" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871575 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aac28d5-6b58-424e-83f8-ec71c53e41ce" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871805 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aac28d5-6b58-424e-83f8-ec71c53e41ce" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871821 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="389c1f10-5aba-4c4d-b0b3-3a38f6038536" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871857 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="acfb1ea8-a8d2-4152-ad18-54d380b289c4" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871873 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9265b8c-0b80-47d9-8f4b-3d996233341e" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871890 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="dnsmasq-dns"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871904 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871944 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.872826 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.876845 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.886750 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c74j8"]
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.041354 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4wxn\" (UniqueName: \"kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.041440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.142734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.142908 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4wxn\" (UniqueName: \"kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.143611 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.185032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4wxn\" (UniqueName: \"kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.204916 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.644893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c74j8"]
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.940358 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c74j8" event={"ID":"135e322f-177c-4bbb-bb3d-0ab19eba6f92","Type":"ContainerStarted","Data":"7997aceefc0685263d40f8dcf307e7e1ba659185b7b47b791ff788b49578f153"}
Mar 20 17:36:59 crc kubenswrapper[4795]: I0320 17:36:59.957702 4795 generic.go:334] "Generic (PLEG): container finished" podID="135e322f-177c-4bbb-bb3d-0ab19eba6f92" containerID="64da53ed17d6e7c8ed644863f568fc0f6e5e946972ad8fd66ba6db39c157b1e6" exitCode=0
Mar 20 17:36:59 crc kubenswrapper[4795]: I0320 17:36:59.957763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c74j8" event={"ID":"135e322f-177c-4bbb-bb3d-0ab19eba6f92","Type":"ContainerDied","Data":"64da53ed17d6e7c8ed644863f568fc0f6e5e946972ad8fd66ba6db39c157b1e6"}
Mar 20 17:37:00 crc kubenswrapper[4795]: I0320 17:37:00.281191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0"
Mar 20 17:37:00 crc kubenswrapper[4795]: E0320 17:37:00.281426 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 17:37:00 crc kubenswrapper[4795]: E0320 17:37:00.281456 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 17:37:00 crc kubenswrapper[4795]: E0320 17:37:00.281515 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:37:16.281494646 +0000 UTC m=+1179.739526187 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.314238 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c74j8"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.408896 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4wxn\" (UniqueName: \"kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn\") pod \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") "
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.408986 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts\") pod \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") "
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.409860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "135e322f-177c-4bbb-bb3d-0ab19eba6f92" (UID: "135e322f-177c-4bbb-bb3d-0ab19eba6f92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.417164 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn" (OuterVolumeSpecName: "kube-api-access-j4wxn") pod "135e322f-177c-4bbb-bb3d-0ab19eba6f92" (UID: "135e322f-177c-4bbb-bb3d-0ab19eba6f92"). InnerVolumeSpecName "kube-api-access-j4wxn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.458031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.511176 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4wxn\" (UniqueName: \"kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.511215 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.636230 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-sv5fz"]
Mar 20 17:37:01 crc kubenswrapper[4795]: E0320 17:37:01.636603 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="135e322f-177c-4bbb-bb3d-0ab19eba6f92" containerName="mariadb-account-create-update"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.636623 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="135e322f-177c-4bbb-bb3d-0ab19eba6f92" containerName="mariadb-account-create-update"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.636833 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="135e322f-177c-4bbb-bb3d-0ab19eba6f92" containerName="mariadb-account-create-update"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.637470 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.640657 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s6lrt"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.649382 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.650995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sv5fz"]
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.816858 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.816907 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.816959 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.817156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz227\" (UniqueName: \"kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.919101 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz227\" (UniqueName: \"kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.919216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.919242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.919298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.925171 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.925616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.928559 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.940814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz227\" (UniqueName: \"kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.951800 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.976124 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c74j8"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.976123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c74j8" event={"ID":"135e322f-177c-4bbb-bb3d-0ab19eba6f92","Type":"ContainerDied","Data":"7997aceefc0685263d40f8dcf307e7e1ba659185b7b47b791ff788b49578f153"}
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.976294 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7997aceefc0685263d40f8dcf307e7e1ba659185b7b47b791ff788b49578f153"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.985859 4795 generic.go:334] "Generic (PLEG): container finished" podID="2c422574-0103-4c97-9e23-5a78c5b44e69" containerID="5d7bc73635aabb988bf8536345a728b4481d0d64844c13a801df663371346492" exitCode=0
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.985903 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m8zw5" event={"ID":"2c422574-0103-4c97-9e23-5a78c5b44e69","Type":"ContainerDied","Data":"5d7bc73635aabb988bf8536345a728b4481d0d64844c13a801df663371346492"}
Mar 20 17:37:02 crc kubenswrapper[4795]: I0320 17:37:02.519452 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sv5fz"]
Mar 20 17:37:02 crc kubenswrapper[4795]: W0320 17:37:02.520422 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode951c331_872c_41b6_b747_d5129b8c0a1b.slice/crio-846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e WatchSource:0}: Error finding container 846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e: Status 404 returned error can't find the container with id 846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e
Mar 20 17:37:02 crc kubenswrapper[4795]: I0320 17:37:02.992658 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sv5fz" event={"ID":"e951c331-872c-41b6-b747-d5129b8c0a1b","Type":"ContainerStarted","Data":"846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e"}
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.312011 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m8zw5"
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441667 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441767 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441791 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxsvz\" (UniqueName: \"kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.442904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.443148 4795 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.443714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.453265 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz" (OuterVolumeSpecName: "kube-api-access-lxsvz") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "kube-api-access-lxsvz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.458555 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.464307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts" (OuterVolumeSpecName: "scripts") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.469303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.470943 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545456 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxsvz\" (UniqueName: \"kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545498 4795 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545509 4795 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545522 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545535 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545547 4795 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:04 crc kubenswrapper[4795]: I0320 17:37:04.002104 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m8zw5" event={"ID":"2c422574-0103-4c97-9e23-5a78c5b44e69","Type":"ContainerDied","Data":"97fb6267b98d2b148faab38e0f46037ac0fdd70749948e0fa5391492bff624c1"}
Mar 20 17:37:04 crc kubenswrapper[4795]: I0320 17:37:04.002161 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97fb6267b98d2b148faab38e0f46037ac0fdd70749948e0fa5391492bff624c1"
Mar 20 17:37:04 crc kubenswrapper[4795]: I0320 17:37:04.002172 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m8zw5"
Mar 20 17:37:04 crc kubenswrapper[4795]: I0320 17:37:04.323616 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-c74j8"]
Mar 20 17:37:04 crc kubenswrapper[4795]: I0320 17:37:04.329202 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-c74j8"]
Mar 20 17:37:05 crc kubenswrapper[4795]: I0320 17:37:05.265878 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="135e322f-177c-4bbb-bb3d-0ab19eba6f92" path="/var/lib/kubelet/pods/135e322f-177c-4bbb-bb3d-0ab19eba6f92/volumes"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.437048 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dnp2g" podUID="28df10bb-d6a9-47a9-9b79-0bb9665529ef" containerName="ovn-controller" probeResult="failure" output=<
Mar 20 17:37:06 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 20 17:37:06 crc kubenswrapper[4795]: >
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.447300 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.481098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.690222 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dnp2g-config-q9gtk"] Mar 20 17:37:06 crc kubenswrapper[4795]: E0320 17:37:06.690931 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c422574-0103-4c97-9e23-5a78c5b44e69" containerName="swift-ring-rebalance" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.690949 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c422574-0103-4c97-9e23-5a78c5b44e69" containerName="swift-ring-rebalance" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.691140 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c422574-0103-4c97-9e23-5a78c5b44e69" containerName="swift-ring-rebalance" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.691808 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.694297 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.697271 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g-config-q9gtk"] Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.732413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.732569 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.732624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.732707 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb27s\" (UniqueName: \"kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: 
\"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.733043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.733082 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834791 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts\") pod 
\"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834905 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb27s\" (UniqueName: \"kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.835418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.835468 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: 
\"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.835501 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.836939 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.836987 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.865058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb27s\" (UniqueName: \"kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:07 crc kubenswrapper[4795]: I0320 17:37:07.016766 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:07 crc kubenswrapper[4795]: I0320 17:37:07.459230 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g-config-q9gtk"] Mar 20 17:37:07 crc kubenswrapper[4795]: W0320 17:37:07.468284 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod170d948e_372e_4b54_8ecf_c370d4b10acb.slice/crio-e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c WatchSource:0}: Error finding container e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c: Status 404 returned error can't find the container with id e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c Mar 20 17:37:08 crc kubenswrapper[4795]: I0320 17:37:08.040775 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-q9gtk" event={"ID":"170d948e-372e-4b54-8ecf-c370d4b10acb","Type":"ContainerStarted","Data":"b72c1a68c2af640e452031ab226e7a764d4d714ebc4a58907bac640f8e0500bb"} Mar 20 17:37:08 crc kubenswrapper[4795]: I0320 17:37:08.041032 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-q9gtk" event={"ID":"170d948e-372e-4b54-8ecf-c370d4b10acb","Type":"ContainerStarted","Data":"e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c"} Mar 20 17:37:08 crc kubenswrapper[4795]: I0320 17:37:08.064012 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dnp2g-config-q9gtk" podStartSLOduration=2.063983908 podStartE2EDuration="2.063983908s" podCreationTimestamp="2026-03-20 17:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:08.06049098 +0000 UTC m=+1171.518522531" watchObservedRunningTime="2026-03-20 17:37:08.063983908 +0000 UTC m=+1171.522015449" 
Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.054160 4795 generic.go:334] "Generic (PLEG): container finished" podID="170d948e-372e-4b54-8ecf-c370d4b10acb" containerID="b72c1a68c2af640e452031ab226e7a764d4d714ebc4a58907bac640f8e0500bb" exitCode=0 Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.054244 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-q9gtk" event={"ID":"170d948e-372e-4b54-8ecf-c370d4b10acb","Type":"ContainerDied","Data":"b72c1a68c2af640e452031ab226e7a764d4d714ebc4a58907bac640f8e0500bb"} Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.322498 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-d7ffs"] Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.324183 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.326140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.330630 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d7ffs"] Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.390409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts\") pod \"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.390488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4dsj\" (UniqueName: \"kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj\") pod 
\"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.492821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts\") pod \"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.492900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4dsj\" (UniqueName: \"kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj\") pod \"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.495042 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts\") pod \"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.516033 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4dsj\" (UniqueName: \"kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj\") pod \"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.646166 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:11 crc kubenswrapper[4795]: I0320 17:37:11.441190 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-dnp2g" Mar 20 17:37:14 crc kubenswrapper[4795]: I0320 17:37:14.895316 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.001967 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016079 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016159 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016214 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016261 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016288 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb27s\" (UniqueName: \"kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.002084 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run" (OuterVolumeSpecName: "var-run") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016822 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016837 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.017295 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.017815 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts" (OuterVolumeSpecName: "scripts") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.019912 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s" (OuterVolumeSpecName: "kube-api-access-kb27s") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "kube-api-access-kb27s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.110039 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-q9gtk" event={"ID":"170d948e-372e-4b54-8ecf-c370d4b10acb","Type":"ContainerDied","Data":"e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c"} Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.110084 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.110099 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.118294 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.118494 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.118635 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb27s\" (UniqueName: 
\"kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.118799 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.316051 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d7ffs"] Mar 20 17:37:15 crc kubenswrapper[4795]: W0320 17:37:15.321468 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc63f125_2d90_43df_a863_b85fb2eb690e.slice/crio-261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab WatchSource:0}: Error finding container 261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab: Status 404 returned error can't find the container with id 261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.009317 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dnp2g-config-q9gtk"] Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.019043 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dnp2g-config-q9gtk"] Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.120132 4795 generic.go:334] "Generic (PLEG): container finished" podID="fc63f125-2d90-43df-a863-b85fb2eb690e" containerID="497d569160d86f7ef365c3f9c537432bd00933f71438ea39707377d46eebd046" exitCode=0 Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.120219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d7ffs" 
event={"ID":"fc63f125-2d90-43df-a863-b85fb2eb690e","Type":"ContainerDied","Data":"497d569160d86f7ef365c3f9c537432bd00933f71438ea39707377d46eebd046"} Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.120249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d7ffs" event={"ID":"fc63f125-2d90-43df-a863-b85fb2eb690e","Type":"ContainerStarted","Data":"261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab"} Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.122024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sv5fz" event={"ID":"e951c331-872c-41b6-b747-d5129b8c0a1b","Type":"ContainerStarted","Data":"470c232d3b8dc5c0134ac3e2610bcf258029ab0b51ac60e2a7728f94a3beb865"} Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.136493 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dnp2g-config-2smjm"] Mar 20 17:37:16 crc kubenswrapper[4795]: E0320 17:37:16.137346 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="170d948e-372e-4b54-8ecf-c370d4b10acb" containerName="ovn-config" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.137379 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="170d948e-372e-4b54-8ecf-c370d4b10acb" containerName="ovn-config" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.137636 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="170d948e-372e-4b54-8ecf-c370d4b10acb" containerName="ovn-config" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.138422 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.140693 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.167079 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g-config-2smjm"] Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.201778 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-sv5fz" podStartSLOduration=2.824677256 podStartE2EDuration="15.20175712s" podCreationTimestamp="2026-03-20 17:37:01 +0000 UTC" firstStartedPulling="2026-03-20 17:37:02.523200991 +0000 UTC m=+1165.981232532" lastFinishedPulling="2026-03-20 17:37:14.900280855 +0000 UTC m=+1178.358312396" observedRunningTime="2026-03-20 17:37:16.200664167 +0000 UTC m=+1179.658695718" watchObservedRunningTime="2026-03-20 17:37:16.20175712 +0000 UTC m=+1179.659788681" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240798 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240857 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8l6\" (UniqueName: \"kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240931 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240985 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.342583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: 
I0320 17:37:16.342628 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.342667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.342752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.342773 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.342787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8l6\" (UniqueName: \"kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 
17:37:16.342812 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.343368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.343756 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.343984 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.344165 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.345895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.350382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.382028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8l6\" (UniqueName: \"kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.458100 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.524887 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.971618 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g-config-2smjm"] Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.134080 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-2smjm" event={"ID":"846c5a67-a071-48ed-a9e7-67c62882835c","Type":"ContainerStarted","Data":"3bcfbfed1148451d8b161514050ae2bcb87b08356c696431428b3c727102e649"} Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.152587 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:37:17 crc kubenswrapper[4795]: W0320 17:37:17.159581 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88e955e5_ba7a_4582_9d52_40333fe21b7f.slice/crio-d6c6ffeb06e45853c659218f8670f9cfcc8fae5ea919b1ca71d019ef6b506873 WatchSource:0}: Error finding container d6c6ffeb06e45853c659218f8670f9cfcc8fae5ea919b1ca71d019ef6b506873: Status 404 returned error can't find the container with id d6c6ffeb06e45853c659218f8670f9cfcc8fae5ea919b1ca71d019ef6b506873 Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.261292 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="170d948e-372e-4b54-8ecf-c370d4b10acb" path="/var/lib/kubelet/pods/170d948e-372e-4b54-8ecf-c370d4b10acb/volumes" Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.417978 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.466358 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4dsj\" (UniqueName: \"kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj\") pod \"fc63f125-2d90-43df-a863-b85fb2eb690e\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.466467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts\") pod \"fc63f125-2d90-43df-a863-b85fb2eb690e\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.467235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc63f125-2d90-43df-a863-b85fb2eb690e" (UID: "fc63f125-2d90-43df-a863-b85fb2eb690e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.472158 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj" (OuterVolumeSpecName: "kube-api-access-l4dsj") pod "fc63f125-2d90-43df-a863-b85fb2eb690e" (UID: "fc63f125-2d90-43df-a863-b85fb2eb690e"). InnerVolumeSpecName "kube-api-access-l4dsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.568150 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4dsj\" (UniqueName: \"kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.568202 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.150351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"d6c6ffeb06e45853c659218f8670f9cfcc8fae5ea919b1ca71d019ef6b506873"} Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.153611 4795 generic.go:334] "Generic (PLEG): container finished" podID="846c5a67-a071-48ed-a9e7-67c62882835c" containerID="9df2c204a93d51c554ceaf159d1f9366b95bed6cc7f2757ae8ae8edae396f498" exitCode=0 Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.153783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-2smjm" event={"ID":"846c5a67-a071-48ed-a9e7-67c62882835c","Type":"ContainerDied","Data":"9df2c204a93d51c554ceaf159d1f9366b95bed6cc7f2757ae8ae8edae396f498"} Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.157523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d7ffs" event={"ID":"fc63f125-2d90-43df-a863-b85fb2eb690e","Type":"ContainerDied","Data":"261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab"} Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.157568 4795 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab" Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.157655 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.172518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"e389a8ec0a888e7cfab4b7ccfee9700e454039da49b040fb106aef2e1974ba85"} Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.173061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"4e4d933685c4a127946eb2e1564e691c50d0f7065bc81b37becfeb97ca12f56e"} Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.173076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"d73a955953d52c7f3cfb477122af22bc0efcd99762b19d80302a0895e38e12e1"} Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.412787 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522212 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522431 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8l6\" (UniqueName: \"kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522571 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522631 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522776 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522769 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522956 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522829 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run" (OuterVolumeSpecName: "var-run") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.523359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.523845 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.523912 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.523934 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.523951 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.524450 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts" (OuterVolumeSpecName: "scripts") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.527552 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6" (OuterVolumeSpecName: "kube-api-access-fc8l6") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "kube-api-access-fc8l6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.625906 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.625953 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8l6\" (UniqueName: \"kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.186676 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"95b2b207073495de5bf016766565d4264f7643e8986bca31d5a886381b8a2a45"} Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.190768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-2smjm" event={"ID":"846c5a67-a071-48ed-a9e7-67c62882835c","Type":"ContainerDied","Data":"3bcfbfed1148451d8b161514050ae2bcb87b08356c696431428b3c727102e649"} Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.190802 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bcfbfed1148451d8b161514050ae2bcb87b08356c696431428b3c727102e649" Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.190858 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.489354 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dnp2g-config-2smjm"] Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.496852 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dnp2g-config-2smjm"] Mar 20 17:37:21 crc kubenswrapper[4795]: I0320 17:37:21.206019 4795 generic.go:334] "Generic (PLEG): container finished" podID="b8103489-e552-49b0-a32a-1069a46feff9" containerID="ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157" exitCode=0 Mar 20 17:37:21 crc kubenswrapper[4795]: I0320 17:37:21.206175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerDied","Data":"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157"} Mar 20 17:37:21 crc kubenswrapper[4795]: I0320 17:37:21.218020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"94fe70416360f160e0eeed043b9070653346046927e4bbd473bf4701703bed88"} Mar 20 17:37:21 crc kubenswrapper[4795]: I0320 17:37:21.218106 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"d41110d0dd61954d11b2ddc28932de3ae795d26e4d019e191f0a7561f5b0b71d"} Mar 20 17:37:21 crc kubenswrapper[4795]: I0320 17:37:21.260531 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="846c5a67-a071-48ed-a9e7-67c62882835c" path="/var/lib/kubelet/pods/846c5a67-a071-48ed-a9e7-67c62882835c/volumes" Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.235953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerStarted","Data":"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3"} Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.236705 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.239185 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerID="5473602d5499b1067c63d6b98d02f2810f56405e993453774e2f6c5d19c36aea" exitCode=0 Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.239256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerDied","Data":"5473602d5499b1067c63d6b98d02f2810f56405e993453774e2f6c5d19c36aea"} Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.244444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"f269d387c50769b37a9d2ea4c757b330becf11be54c6f550a125f2c2bb94b986"} Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.244497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"516ad9c443197cc9e8165a78721fae46f2a7508b0cdfea8e94ef814847317bad"} Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.267772 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.34648388 podStartE2EDuration="1m26.267754148s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.796273767 +0000 UTC m=+1114.254305318" lastFinishedPulling="2026-03-20 17:36:47.717544005 +0000 UTC m=+1151.175575586" observedRunningTime="2026-03-20 17:37:22.262726242 +0000 UTC 
m=+1185.720757803" watchObservedRunningTime="2026-03-20 17:37:22.267754148 +0000 UTC m=+1185.725785689" Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.260975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"92ead5f56ac7347c810dd90b1560b35e81836d7f359a796b46a30a7c6c019707"} Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.261021 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"d4618223e960efe7ce48a402fa7f9f6d73b91babeb1e98beebab62ac5781f4dc"} Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.263803 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerStarted","Data":"930cc5d12a5b8ceb897b37f689f02ab87b93b53244832868d8761d1d4336b1e3"} Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.263971 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.266486 4795 generic.go:334] "Generic (PLEG): container finished" podID="e951c331-872c-41b6-b747-d5129b8c0a1b" containerID="470c232d3b8dc5c0134ac3e2610bcf258029ab0b51ac60e2a7728f94a3beb865" exitCode=0 Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.266586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sv5fz" event={"ID":"e951c331-872c-41b6-b747-d5129b8c0a1b","Type":"ContainerDied","Data":"470c232d3b8dc5c0134ac3e2610bcf258029ab0b51ac60e2a7728f94a3beb865"} Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.288852 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371949.56594 
podStartE2EDuration="1m27.288835305s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.998998659 +0000 UTC m=+1114.457030200" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:23.285323915 +0000 UTC m=+1186.743355466" watchObservedRunningTime="2026-03-20 17:37:23.288835305 +0000 UTC m=+1186.746866856" Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.281033 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"cf71f961fcc91b9e35fa81659d3f697bfb918bf0cbf6d4d3d897add69350217e"} Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.281366 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"7dd41f0c40aeeab4fb0e2dda77fcf2039ae08488c188c3ac01c40ecd248916e7"} Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.656956 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-sv5fz" Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.708541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data\") pod \"e951c331-872c-41b6-b747-d5129b8c0a1b\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.708663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data\") pod \"e951c331-872c-41b6-b747-d5129b8c0a1b\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.708762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle\") pod \"e951c331-872c-41b6-b747-d5129b8c0a1b\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.708806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz227\" (UniqueName: \"kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227\") pod \"e951c331-872c-41b6-b747-d5129b8c0a1b\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.724001 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e951c331-872c-41b6-b747-d5129b8c0a1b" (UID: "e951c331-872c-41b6-b747-d5129b8c0a1b"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.728988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227" (OuterVolumeSpecName: "kube-api-access-pz227") pod "e951c331-872c-41b6-b747-d5129b8c0a1b" (UID: "e951c331-872c-41b6-b747-d5129b8c0a1b"). InnerVolumeSpecName "kube-api-access-pz227". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.752634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e951c331-872c-41b6-b747-d5129b8c0a1b" (UID: "e951c331-872c-41b6-b747-d5129b8c0a1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.754218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data" (OuterVolumeSpecName: "config-data") pod "e951c331-872c-41b6-b747-d5129b8c0a1b" (UID: "e951c331-872c-41b6-b747-d5129b8c0a1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.810885 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.810920 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.810932 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.810944 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz227\" (UniqueName: \"kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.288199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sv5fz" event={"ID":"e951c331-872c-41b6-b747-d5129b8c0a1b","Type":"ContainerDied","Data":"846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e"} Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.288536 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.288603 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-sv5fz" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.801549 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"] Mar 20 17:37:25 crc kubenswrapper[4795]: E0320 17:37:25.801964 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e951c331-872c-41b6-b747-d5129b8c0a1b" containerName="glance-db-sync" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802004 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e951c331-872c-41b6-b747-d5129b8c0a1b" containerName="glance-db-sync" Mar 20 17:37:25 crc kubenswrapper[4795]: E0320 17:37:25.802028 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc63f125-2d90-43df-a863-b85fb2eb690e" containerName="mariadb-account-create-update" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802037 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc63f125-2d90-43df-a863-b85fb2eb690e" containerName="mariadb-account-create-update" Mar 20 17:37:25 crc kubenswrapper[4795]: E0320 17:37:25.802054 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846c5a67-a071-48ed-a9e7-67c62882835c" containerName="ovn-config" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802062 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="846c5a67-a071-48ed-a9e7-67c62882835c" containerName="ovn-config" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802239 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc63f125-2d90-43df-a863-b85fb2eb690e" containerName="mariadb-account-create-update" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802263 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="846c5a67-a071-48ed-a9e7-67c62882835c" containerName="ovn-config" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802282 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e951c331-872c-41b6-b747-d5129b8c0a1b" containerName="glance-db-sync" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.803254 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.819993 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"] Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.829526 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2svd\" (UniqueName: \"kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.829616 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.829750 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.829885 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.829968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.931942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.932003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.932086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2svd\" (UniqueName: \"kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.932123 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.932155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.933229 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.933275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.933282 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.933244 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.950546 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2svd\" (UniqueName: \"kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:26 crc kubenswrapper[4795]: I0320 17:37:26.117549 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:26 crc kubenswrapper[4795]: I0320 17:37:26.744333 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"] Mar 20 17:37:27 crc kubenswrapper[4795]: I0320 17:37:27.303837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" event={"ID":"c1a56dca-ff60-46df-8582-70547b180198","Type":"ContainerStarted","Data":"aa3f1672f9f6d3489df066685a31982534a33df447f9e485fd30c0b9c6ecc887"} Mar 20 17:37:27 crc kubenswrapper[4795]: I0320 17:37:27.310260 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"bf3ce42cae265bb99fb791f8eed17095fce85d87cfb2f7c86fb8f9b0bfe58bf9"} Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.319738 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1a56dca-ff60-46df-8582-70547b180198" containerID="2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7" exitCode=0 Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.319822 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" event={"ID":"c1a56dca-ff60-46df-8582-70547b180198","Type":"ContainerDied","Data":"2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7"} Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.328663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"f9b19475d31f465754f36521ef65528e71a402fa42bc145e49d11499c802e865"} Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.328724 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"2aac8e6c32a191322b5f706bcb756b20123a5381decfbbcdfa8f3b0ec3eb505b"} Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.423405 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.762671026 podStartE2EDuration="45.423390706s" podCreationTimestamp="2026-03-20 17:36:43 +0000 UTC" firstStartedPulling="2026-03-20 17:37:17.162393946 +0000 UTC m=+1180.620425487" lastFinishedPulling="2026-03-20 17:37:22.823113586 +0000 UTC m=+1186.281145167" observedRunningTime="2026-03-20 17:37:28.421894619 +0000 UTC m=+1191.879926170" watchObservedRunningTime="2026-03-20 17:37:28.423390706 +0000 UTC m=+1191.881422247" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.682505 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"] Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.709659 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"] Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.711188 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.722053 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.733372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"] Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.779657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.779815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxcnf\" (UniqueName: \"kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.779849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.779884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.779956 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.780038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.881821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.881877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxcnf\" (UniqueName: \"kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.881899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.881919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.881954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.882008 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.882832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.883304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc 
kubenswrapper[4795]: I0320 17:37:28.884059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.884533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.885058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.910329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxcnf\" (UniqueName: \"kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:29 crc kubenswrapper[4795]: I0320 17:37:29.027340 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:29 crc kubenswrapper[4795]: I0320 17:37:29.337848 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" event={"ID":"c1a56dca-ff60-46df-8582-70547b180198","Type":"ContainerStarted","Data":"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683"} Mar 20 17:37:29 crc kubenswrapper[4795]: I0320 17:37:29.338120 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:29 crc kubenswrapper[4795]: I0320 17:37:29.355161 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" podStartSLOduration=4.355140811 podStartE2EDuration="4.355140811s" podCreationTimestamp="2026-03-20 17:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:29.351979173 +0000 UTC m=+1192.810010714" watchObservedRunningTime="2026-03-20 17:37:29.355140811 +0000 UTC m=+1192.813172352" Mar 20 17:37:29 crc kubenswrapper[4795]: I0320 17:37:29.574362 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"] Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.348877 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed82521a-8a78-4611-870f-5ad53625bddf" containerID="99144a28a69d3c6fb3d096bda447f3ff1d42028233b58205725db4c507c27464" exitCode=0 Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.348937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" event={"ID":"ed82521a-8a78-4611-870f-5ad53625bddf","Type":"ContainerDied","Data":"99144a28a69d3c6fb3d096bda447f3ff1d42028233b58205725db4c507c27464"} Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.349403 4795 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="dnsmasq-dns" containerID="cri-o://3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683" gracePeriod=10 Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.349458 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" event={"ID":"ed82521a-8a78-4611-870f-5ad53625bddf","Type":"ContainerStarted","Data":"4bbbefbb238ecd4d186023f5577494b794cb2bb3b7f5edef795d647b067a660b"} Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.743029 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.911211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb\") pod \"c1a56dca-ff60-46df-8582-70547b180198\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.912015 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb\") pod \"c1a56dca-ff60-46df-8582-70547b180198\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.912132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc\") pod \"c1a56dca-ff60-46df-8582-70547b180198\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.912184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2svd\" (UniqueName: 
\"kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd\") pod \"c1a56dca-ff60-46df-8582-70547b180198\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.912225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config\") pod \"c1a56dca-ff60-46df-8582-70547b180198\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.919759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd" (OuterVolumeSpecName: "kube-api-access-m2svd") pod "c1a56dca-ff60-46df-8582-70547b180198" (UID: "c1a56dca-ff60-46df-8582-70547b180198"). InnerVolumeSpecName "kube-api-access-m2svd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.970464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1a56dca-ff60-46df-8582-70547b180198" (UID: "c1a56dca-ff60-46df-8582-70547b180198"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.972362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c1a56dca-ff60-46df-8582-70547b180198" (UID: "c1a56dca-ff60-46df-8582-70547b180198"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.976658 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config" (OuterVolumeSpecName: "config") pod "c1a56dca-ff60-46df-8582-70547b180198" (UID: "c1a56dca-ff60-46df-8582-70547b180198"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.977738 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c1a56dca-ff60-46df-8582-70547b180198" (UID: "c1a56dca-ff60-46df-8582-70547b180198"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.015281 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.015334 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.015346 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.015359 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:31 crc kubenswrapper[4795]: 
I0320 17:37:31.015370 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2svd\" (UniqueName: \"kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.360010 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" event={"ID":"ed82521a-8a78-4611-870f-5ad53625bddf","Type":"ContainerStarted","Data":"46bc17154a9a8ac331153a8f5a685f3f770ff87ff542f8ade4f4cba000f1d481"} Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.360731 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.363097 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1a56dca-ff60-46df-8582-70547b180198" containerID="3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683" exitCode=0 Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.363136 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" event={"ID":"c1a56dca-ff60-46df-8582-70547b180198","Type":"ContainerDied","Data":"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683"} Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.363159 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" event={"ID":"c1a56dca-ff60-46df-8582-70547b180198","Type":"ContainerDied","Data":"aa3f1672f9f6d3489df066685a31982534a33df447f9e485fd30c0b9c6ecc887"} Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.363173 4795 scope.go:117] "RemoveContainer" containerID="3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.363212 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.381371 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" podStartSLOduration=3.381350408 podStartE2EDuration="3.381350408s" podCreationTimestamp="2026-03-20 17:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:31.379659906 +0000 UTC m=+1194.837691537" watchObservedRunningTime="2026-03-20 17:37:31.381350408 +0000 UTC m=+1194.839381979" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.457830 4795 scope.go:117] "RemoveContainer" containerID="2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.463759 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"] Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.467311 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"] Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.494584 4795 scope.go:117] "RemoveContainer" containerID="3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683" Mar 20 17:37:31 crc kubenswrapper[4795]: E0320 17:37:31.494985 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683\": container with ID starting with 3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683 not found: ID does not exist" containerID="3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.495026 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683"} err="failed to get container status \"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683\": rpc error: code = NotFound desc = could not find container \"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683\": container with ID starting with 3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683 not found: ID does not exist" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.495053 4795 scope.go:117] "RemoveContainer" containerID="2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7" Mar 20 17:37:31 crc kubenswrapper[4795]: E0320 17:37:31.495458 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7\": container with ID starting with 2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7 not found: ID does not exist" containerID="2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.495480 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7"} err="failed to get container status \"2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7\": rpc error: code = NotFound desc = could not find container \"2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7\": container with ID starting with 2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7 not found: ID does not exist" Mar 20 17:37:33 crc kubenswrapper[4795]: I0320 17:37:33.264017 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a56dca-ff60-46df-8582-70547b180198" path="/var/lib/kubelet/pods/c1a56dca-ff60-46df-8582-70547b180198/volumes" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 
17:37:37.502041 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.834904 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.835823 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-b5nks"] Mar 20 17:37:37 crc kubenswrapper[4795]: E0320 17:37:37.836140 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="init" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.836156 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="init" Mar 20 17:37:37 crc kubenswrapper[4795]: E0320 17:37:37.836176 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="dnsmasq-dns" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.836183 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="dnsmasq-dns" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.836338 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="dnsmasq-dns" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.836791 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.852437 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-b5nks"] Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.937993 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vqn\" (UniqueName: \"kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.938074 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.987515 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-fe40-account-create-update-jh9t8"] Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.988469 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.991470 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.004383 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fe40-account-create-update-jh9t8"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.040776 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vqn\" (UniqueName: \"kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.040824 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.041489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.112070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vqn\" (UniqueName: \"kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 
17:37:38.143526 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts\") pod \"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.143598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6cxv\" (UniqueName: \"kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv\") pod \"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.144711 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2xqcf"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.146064 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.161778 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.168588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2xqcf"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.223200 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-d5tx6"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.224325 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.247485 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgnfn\" (UniqueName: \"kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn\") pod \"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.247801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts\") pod \"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.247919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6cxv\" (UniqueName: \"kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv\") pod \"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.248006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts\") pod \"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.248791 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts\") pod 
\"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.249118 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c60c-account-create-update-hhzrt"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.250090 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.257305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d5tx6"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.257960 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.268833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c60c-account-create-update-hhzrt"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.283997 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dwhh5"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.284918 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.286522 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.287045 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29kfm" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.287463 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.287668 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.294635 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6cxv\" (UniqueName: \"kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv\") pod \"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.294725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dwhh5"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.302279 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.349637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.349670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.349722 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hdwq\" (UniqueName: \"kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.350131 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts\") pod \"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.350258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgnfn\" (UniqueName: \"kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn\") pod 
\"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.350302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5j4r\" (UniqueName: \"kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.360549 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts\") pod \"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.368396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgnfn\" (UniqueName: \"kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn\") pod \"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.438286 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0ac3-account-create-update-km4zq"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.440030 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.443056 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5j4r\" (UniqueName: \"kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451391 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451485 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6hdwq\" (UniqueName: \"kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lmwj\" (UniqueName: \"kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.452394 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.453179 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.457706 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-0ac3-account-create-update-km4zq"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.466841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hdwq\" (UniqueName: \"kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.471004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5j4r\" (UniqueName: \"kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.479038 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.505540 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-b5nks"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.554521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l97q4\" (UniqueName: \"kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.554962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " 
pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.555025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.555050 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.555094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lmwj\" (UniqueName: \"kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.559953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.561836 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc 
kubenswrapper[4795]: I0320 17:37:38.575448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lmwj\" (UniqueName: \"kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.656150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l97q4\" (UniqueName: \"kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.656227 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.656981 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.668807 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.671900 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l97q4\" (UniqueName: \"kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.680808 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.688669 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.763106 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.788939 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fe40-account-create-update-jh9t8"] Mar 20 17:37:38 crc kubenswrapper[4795]: W0320 17:37:38.808409 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ff9ec79_6bd9_470e_8a75_8df1f3c52851.slice/crio-5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b WatchSource:0}: Error finding container 5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b: Status 404 returned error can't find the container with id 5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.911523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2xqcf"] Mar 20 17:37:38 crc kubenswrapper[4795]: W0320 17:37:38.935392 
4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode065e2d4_096b_426b_a1f8_14311adb7cbc.slice/crio-b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984 WatchSource:0}: Error finding container b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984: Status 404 returned error can't find the container with id b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.027809 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.112075 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"] Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.112322 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-8grln" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="dnsmasq-dns" containerID="cri-o://7043047e88a0378017830c2f9e0915780f4eef0732a290fd3459875c42d1f7cc" gracePeriod=10 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.149824 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c60c-account-create-update-hhzrt"] Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.212703 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dwhh5"] Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.221967 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d5tx6"] Mar 20 17:37:39 crc kubenswrapper[4795]: W0320 17:37:39.272018 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b15c724_622b_4da7_96a3_01949d04ecac.slice/crio-aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565 
WatchSource:0}: Error finding container aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565: Status 404 returned error can't find the container with id aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.401786 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0ac3-account-create-update-km4zq"] Mar 20 17:37:39 crc kubenswrapper[4795]: W0320 17:37:39.451924 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c0847a4_54b5_4068_bfa8_730a19e96d9c.slice/crio-c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9 WatchSource:0}: Error finding container c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9: Status 404 returned error can't find the container with id c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.454728 4795 generic.go:334] "Generic (PLEG): container finished" podID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerID="7043047e88a0378017830c2f9e0915780f4eef0732a290fd3459875c42d1f7cc" exitCode=0 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.454776 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8grln" event={"ID":"601af69d-c03f-4bdf-b3bf-67ba791674f9","Type":"ContainerDied","Data":"7043047e88a0378017830c2f9e0915780f4eef0732a290fd3459875c42d1f7cc"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.459020 4795 generic.go:334] "Generic (PLEG): container finished" podID="e065e2d4-096b-426b-a1f8-14311adb7cbc" containerID="0bd7daab0116804ff4450e365b81c7b208a20da2cd4b665ce83729724da32638" exitCode=0 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.459087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2xqcf" 
event={"ID":"e065e2d4-096b-426b-a1f8-14311adb7cbc","Type":"ContainerDied","Data":"0bd7daab0116804ff4450e365b81c7b208a20da2cd4b665ce83729724da32638"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.459118 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2xqcf" event={"ID":"e065e2d4-096b-426b-a1f8-14311adb7cbc","Type":"ContainerStarted","Data":"b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.461244 4795 generic.go:334] "Generic (PLEG): container finished" podID="373ddf98-d9da-4f1f-a6be-3d16e3cbad57" containerID="d85d02558764daff4d2300daa1f7a51dd79d0b89452cbb1821643bd1f3d0ff3c" exitCode=0 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.461296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-b5nks" event={"ID":"373ddf98-d9da-4f1f-a6be-3d16e3cbad57","Type":"ContainerDied","Data":"d85d02558764daff4d2300daa1f7a51dd79d0b89452cbb1821643bd1f3d0ff3c"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.461314 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-b5nks" event={"ID":"373ddf98-d9da-4f1f-a6be-3d16e3cbad57","Type":"ContainerStarted","Data":"01df350322d624589cd020be71f4012590e18461b49a99764038fdca3981f6bb"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.463145 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ff9ec79-6bd9-470e-8a75-8df1f3c52851" containerID="0f5ef18005b655abcc8e4883b9bee8538648f3cf86fe68a6e17cb1ecb194c52e" exitCode=0 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.463210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fe40-account-create-update-jh9t8" event={"ID":"9ff9ec79-6bd9-470e-8a75-8df1f3c52851","Type":"ContainerDied","Data":"0f5ef18005b655abcc8e4883b9bee8538648f3cf86fe68a6e17cb1ecb194c52e"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.463229 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-fe40-account-create-update-jh9t8" event={"ID":"9ff9ec79-6bd9-470e-8a75-8df1f3c52851","Type":"ContainerStarted","Data":"5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.464415 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c60c-account-create-update-hhzrt" event={"ID":"36d71698-4dc2-448a-9330-23372e2d508b","Type":"ContainerStarted","Data":"27f218d293aef7485b27dc5eda4c8db8ebcf99d106c06e80ce6dd66bae5fccab"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.468120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dwhh5" event={"ID":"7b15c724-622b-4da7-96a3-01949d04ecac","Type":"ContainerStarted","Data":"aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.469098 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d5tx6" event={"ID":"18b1c5f0-e7fb-44b7-8c75-c8036f371c56","Type":"ContainerStarted","Data":"a2be0f1e436101489f20c15d229127ac688cb2bc6666ffbfaa6ce5f6656ba736"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.557515 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.672021 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc\") pod \"601af69d-c03f-4bdf-b3bf-67ba791674f9\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.672093 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb\") pod \"601af69d-c03f-4bdf-b3bf-67ba791674f9\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.672149 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmkt4\" (UniqueName: \"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4\") pod \"601af69d-c03f-4bdf-b3bf-67ba791674f9\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.672169 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb\") pod \"601af69d-c03f-4bdf-b3bf-67ba791674f9\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.672209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config\") pod \"601af69d-c03f-4bdf-b3bf-67ba791674f9\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.677411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4" (OuterVolumeSpecName: "kube-api-access-xmkt4") pod "601af69d-c03f-4bdf-b3bf-67ba791674f9" (UID: "601af69d-c03f-4bdf-b3bf-67ba791674f9"). InnerVolumeSpecName "kube-api-access-xmkt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.719904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config" (OuterVolumeSpecName: "config") pod "601af69d-c03f-4bdf-b3bf-67ba791674f9" (UID: "601af69d-c03f-4bdf-b3bf-67ba791674f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.722524 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "601af69d-c03f-4bdf-b3bf-67ba791674f9" (UID: "601af69d-c03f-4bdf-b3bf-67ba791674f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.724915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "601af69d-c03f-4bdf-b3bf-67ba791674f9" (UID: "601af69d-c03f-4bdf-b3bf-67ba791674f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.729339 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "601af69d-c03f-4bdf-b3bf-67ba791674f9" (UID: "601af69d-c03f-4bdf-b3bf-67ba791674f9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.774155 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.774416 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.774432 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmkt4\" (UniqueName: \"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.774443 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.774456 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.481084 4795 generic.go:334] "Generic (PLEG): container finished" podID="1c0847a4-54b5-4068-bfa8-730a19e96d9c" containerID="e0dfbaccbeeb5b8f99fb5498e364810e7c89661123cf8b487af69c7d6020134e" exitCode=0 Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.481186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0ac3-account-create-update-km4zq" event={"ID":"1c0847a4-54b5-4068-bfa8-730a19e96d9c","Type":"ContainerDied","Data":"e0dfbaccbeeb5b8f99fb5498e364810e7c89661123cf8b487af69c7d6020134e"} Mar 20 17:37:40 crc 
kubenswrapper[4795]: I0320 17:37:40.481237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0ac3-account-create-update-km4zq" event={"ID":"1c0847a4-54b5-4068-bfa8-730a19e96d9c","Type":"ContainerStarted","Data":"c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9"} Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.483996 4795 generic.go:334] "Generic (PLEG): container finished" podID="36d71698-4dc2-448a-9330-23372e2d508b" containerID="3e3072d7a6a60ff440da8ec24082885e62958e6ce5ded9fd9910a3d0c2817a07" exitCode=0 Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.484063 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c60c-account-create-update-hhzrt" event={"ID":"36d71698-4dc2-448a-9330-23372e2d508b","Type":"ContainerDied","Data":"3e3072d7a6a60ff440da8ec24082885e62958e6ce5ded9fd9910a3d0c2817a07"} Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.486436 4795 generic.go:334] "Generic (PLEG): container finished" podID="18b1c5f0-e7fb-44b7-8c75-c8036f371c56" containerID="5d693b98a616da996bc733e3508b576b31f68d8eb1c9fc7b9800283fac04b343" exitCode=0 Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.486501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d5tx6" event={"ID":"18b1c5f0-e7fb-44b7-8c75-c8036f371c56","Type":"ContainerDied","Data":"5d693b98a616da996bc733e3508b576b31f68d8eb1c9fc7b9800283fac04b343"} Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.490508 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.493721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8grln" event={"ID":"601af69d-c03f-4bdf-b3bf-67ba791674f9","Type":"ContainerDied","Data":"f49aeeb3f2eaf36f87117fd9bb4fc971651fbd9ce295830a95f52f7faa0753d6"} Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.493820 4795 scope.go:117] "RemoveContainer" containerID="7043047e88a0378017830c2f9e0915780f4eef0732a290fd3459875c42d1f7cc" Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.531934 4795 scope.go:117] "RemoveContainer" containerID="a43865e4251904d08d5c0655d2fd65e83c2843454a9cfbf7734d7aa91dad11f3" Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.568130 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"] Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.574672 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"] Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.883327 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.969185 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.973758 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.994647 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts\") pod \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.994737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6cxv\" (UniqueName: \"kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv\") pod \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.996162 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ff9ec79-6bd9-470e-8a75-8df1f3c52851" (UID: "9ff9ec79-6bd9-470e-8a75-8df1f3c52851"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.004062 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv" (OuterVolumeSpecName: "kube-api-access-p6cxv") pod "9ff9ec79-6bd9-470e-8a75-8df1f3c52851" (UID: "9ff9ec79-6bd9-470e-8a75-8df1f3c52851"). InnerVolumeSpecName "kube-api-access-p6cxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.096501 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts\") pod \"e065e2d4-096b-426b-a1f8-14311adb7cbc\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.096807 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgnfn\" (UniqueName: \"kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn\") pod \"e065e2d4-096b-426b-a1f8-14311adb7cbc\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.096861 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7vqn\" (UniqueName: \"kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn\") pod \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.096882 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts\") pod \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.097301 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.097320 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6cxv\" (UniqueName: 
\"kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.097473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e065e2d4-096b-426b-a1f8-14311adb7cbc" (UID: "e065e2d4-096b-426b-a1f8-14311adb7cbc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.097635 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "373ddf98-d9da-4f1f-a6be-3d16e3cbad57" (UID: "373ddf98-d9da-4f1f-a6be-3d16e3cbad57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.101506 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn" (OuterVolumeSpecName: "kube-api-access-tgnfn") pod "e065e2d4-096b-426b-a1f8-14311adb7cbc" (UID: "e065e2d4-096b-426b-a1f8-14311adb7cbc"). InnerVolumeSpecName "kube-api-access-tgnfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.102412 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn" (OuterVolumeSpecName: "kube-api-access-c7vqn") pod "373ddf98-d9da-4f1f-a6be-3d16e3cbad57" (UID: "373ddf98-d9da-4f1f-a6be-3d16e3cbad57"). InnerVolumeSpecName "kube-api-access-c7vqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.198965 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgnfn\" (UniqueName: \"kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.199040 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7vqn\" (UniqueName: \"kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.199050 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.199059 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.262817 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" path="/var/lib/kubelet/pods/601af69d-c03f-4bdf-b3bf-67ba791674f9/volumes" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.299643 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.299705 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.499957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2xqcf" event={"ID":"e065e2d4-096b-426b-a1f8-14311adb7cbc","Type":"ContainerDied","Data":"b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984"} Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.499999 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.500031 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.502345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-b5nks" event={"ID":"373ddf98-d9da-4f1f-a6be-3d16e3cbad57","Type":"ContainerDied","Data":"01df350322d624589cd020be71f4012590e18461b49a99764038fdca3981f6bb"} Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.502389 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01df350322d624589cd020be71f4012590e18461b49a99764038fdca3981f6bb" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.502393 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.504292 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.504286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fe40-account-create-update-jh9t8" event={"ID":"9ff9ec79-6bd9-470e-8a75-8df1f3c52851","Type":"ContainerDied","Data":"5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b"} Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.504479 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b" Mar 20 17:37:43 crc kubenswrapper[4795]: I0320 17:37:43.937087 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:43 crc kubenswrapper[4795]: I0320 17:37:43.942807 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:43 crc kubenswrapper[4795]: I0320 17:37:43.948080 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.066912 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts\") pod \"36d71698-4dc2-448a-9330-23372e2d508b\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.066988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l97q4\" (UniqueName: \"kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4\") pod \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067057 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts\") pod \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067080 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts\") pod \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067122 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hdwq\" (UniqueName: \"kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq\") pod \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067161 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-t5j4r\" (UniqueName: \"kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r\") pod \"36d71698-4dc2-448a-9330-23372e2d508b\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067400 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36d71698-4dc2-448a-9330-23372e2d508b" (UID: "36d71698-4dc2-448a-9330-23372e2d508b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067881 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18b1c5f0-e7fb-44b7-8c75-c8036f371c56" (UID: "18b1c5f0-e7fb-44b7-8c75-c8036f371c56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c0847a4-54b5-4068-bfa8-730a19e96d9c" (UID: "1c0847a4-54b5-4068-bfa8-730a19e96d9c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.071134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq" (OuterVolumeSpecName: "kube-api-access-6hdwq") pod "18b1c5f0-e7fb-44b7-8c75-c8036f371c56" (UID: "18b1c5f0-e7fb-44b7-8c75-c8036f371c56"). InnerVolumeSpecName "kube-api-access-6hdwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.071165 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4" (OuterVolumeSpecName: "kube-api-access-l97q4") pod "1c0847a4-54b5-4068-bfa8-730a19e96d9c" (UID: "1c0847a4-54b5-4068-bfa8-730a19e96d9c"). InnerVolumeSpecName "kube-api-access-l97q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.072914 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r" (OuterVolumeSpecName: "kube-api-access-t5j4r") pod "36d71698-4dc2-448a-9330-23372e2d508b" (UID: "36d71698-4dc2-448a-9330-23372e2d508b"). InnerVolumeSpecName "kube-api-access-t5j4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.169329 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l97q4\" (UniqueName: \"kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.169729 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.169743 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.169754 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hdwq\" (UniqueName: \"kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.169764 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5j4r\" (UniqueName: \"kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.544583 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dwhh5" event={"ID":"7b15c724-622b-4da7-96a3-01949d04ecac","Type":"ContainerStarted","Data":"5b477a108858cabbe8510a1a17d7f7ac3c69ce053e8fe87204336bad4594bfc0"} Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.548947 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d5tx6" 
event={"ID":"18b1c5f0-e7fb-44b7-8c75-c8036f371c56","Type":"ContainerDied","Data":"a2be0f1e436101489f20c15d229127ac688cb2bc6666ffbfaa6ce5f6656ba736"} Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.549070 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2be0f1e436101489f20c15d229127ac688cb2bc6666ffbfaa6ce5f6656ba736" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.549265 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.559309 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0ac3-account-create-update-km4zq" event={"ID":"1c0847a4-54b5-4068-bfa8-730a19e96d9c","Type":"ContainerDied","Data":"c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9"} Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.559354 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.559429 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.575487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c60c-account-create-update-hhzrt" event={"ID":"36d71698-4dc2-448a-9330-23372e2d508b","Type":"ContainerDied","Data":"27f218d293aef7485b27dc5eda4c8db8ebcf99d106c06e80ce6dd66bae5fccab"} Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.575553 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f218d293aef7485b27dc5eda4c8db8ebcf99d106c06e80ce6dd66bae5fccab" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.575595 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.580102 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dwhh5" podStartSLOduration=1.915128879 podStartE2EDuration="6.580073701s" podCreationTimestamp="2026-03-20 17:37:38 +0000 UTC" firstStartedPulling="2026-03-20 17:37:39.274786895 +0000 UTC m=+1202.732818436" lastFinishedPulling="2026-03-20 17:37:43.939731717 +0000 UTC m=+1207.397763258" observedRunningTime="2026-03-20 17:37:44.572085392 +0000 UTC m=+1208.030117003" watchObservedRunningTime="2026-03-20 17:37:44.580073701 +0000 UTC m=+1208.038105272" Mar 20 17:37:44 crc kubenswrapper[4795]: E0320 17:37:44.790095 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c0847a4_54b5_4068_bfa8_730a19e96d9c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b1c5f0_e7fb_44b7_8c75_c8036f371c56.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d71698_4dc2_448a_9330_23372e2d508b.slice/crio-27f218d293aef7485b27dc5eda4c8db8ebcf99d106c06e80ce6dd66bae5fccab\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d71698_4dc2_448a_9330_23372e2d508b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b1c5f0_e7fb_44b7_8c75_c8036f371c56.slice/crio-a2be0f1e436101489f20c15d229127ac688cb2bc6666ffbfaa6ce5f6656ba736\": RecentStats: unable to find data in memory cache]" Mar 20 17:37:47 crc kubenswrapper[4795]: I0320 17:37:47.612022 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="7b15c724-622b-4da7-96a3-01949d04ecac" containerID="5b477a108858cabbe8510a1a17d7f7ac3c69ce053e8fe87204336bad4594bfc0" exitCode=0 Mar 20 17:37:47 crc kubenswrapper[4795]: I0320 17:37:47.612135 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dwhh5" event={"ID":"7b15c724-622b-4da7-96a3-01949d04ecac","Type":"ContainerDied","Data":"5b477a108858cabbe8510a1a17d7f7ac3c69ce053e8fe87204336bad4594bfc0"} Mar 20 17:37:48 crc kubenswrapper[4795]: I0320 17:37:48.953915 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.056329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data\") pod \"7b15c724-622b-4da7-96a3-01949d04ecac\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.056503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lmwj\" (UniqueName: \"kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj\") pod \"7b15c724-622b-4da7-96a3-01949d04ecac\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.056596 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle\") pod \"7b15c724-622b-4da7-96a3-01949d04ecac\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.061512 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj" (OuterVolumeSpecName: "kube-api-access-6lmwj") pod 
"7b15c724-622b-4da7-96a3-01949d04ecac" (UID: "7b15c724-622b-4da7-96a3-01949d04ecac"). InnerVolumeSpecName "kube-api-access-6lmwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.082962 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b15c724-622b-4da7-96a3-01949d04ecac" (UID: "7b15c724-622b-4da7-96a3-01949d04ecac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.109084 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data" (OuterVolumeSpecName: "config-data") pod "7b15c724-622b-4da7-96a3-01949d04ecac" (UID: "7b15c724-622b-4da7-96a3-01949d04ecac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.159461 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.159529 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lmwj\" (UniqueName: \"kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.159558 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.636811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dwhh5" event={"ID":"7b15c724-622b-4da7-96a3-01949d04ecac","Type":"ContainerDied","Data":"aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565"} Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.636856 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.636876 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865024 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"] Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865430 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b15c724-622b-4da7-96a3-01949d04ecac" containerName="keystone-db-sync" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865452 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b15c724-622b-4da7-96a3-01949d04ecac" containerName="keystone-db-sync" Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865479 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d71698-4dc2-448a-9330-23372e2d508b" containerName="mariadb-account-create-update" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865487 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d71698-4dc2-448a-9330-23372e2d508b" containerName="mariadb-account-create-update" Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865497 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="dnsmasq-dns" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865504 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="dnsmasq-dns" Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865518 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b1c5f0-e7fb-44b7-8c75-c8036f371c56" containerName="mariadb-database-create" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865526 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b1c5f0-e7fb-44b7-8c75-c8036f371c56" containerName="mariadb-database-create" Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865542 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="373ddf98-d9da-4f1f-a6be-3d16e3cbad57" containerName="mariadb-database-create" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865549 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="373ddf98-d9da-4f1f-a6be-3d16e3cbad57" containerName="mariadb-database-create" Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865561 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="init" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865569 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="init" Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865579 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0847a4-54b5-4068-bfa8-730a19e96d9c" containerName="mariadb-account-create-update" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865587 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0847a4-54b5-4068-bfa8-730a19e96d9c" containerName="mariadb-account-create-update" Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865604 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e065e2d4-096b-426b-a1f8-14311adb7cbc" containerName="mariadb-database-create" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865611 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e065e2d4-096b-426b-a1f8-14311adb7cbc" containerName="mariadb-database-create" Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865623 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff9ec79-6bd9-470e-8a75-8df1f3c52851" containerName="mariadb-account-create-update" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865630 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff9ec79-6bd9-470e-8a75-8df1f3c52851" containerName="mariadb-account-create-update" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865945 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff9ec79-6bd9-470e-8a75-8df1f3c52851" containerName="mariadb-account-create-update" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b15c724-622b-4da7-96a3-01949d04ecac" containerName="keystone-db-sync" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865975 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="373ddf98-d9da-4f1f-a6be-3d16e3cbad57" containerName="mariadb-database-create" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865990 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e065e2d4-096b-426b-a1f8-14311adb7cbc" containerName="mariadb-database-create" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.866005 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0847a4-54b5-4068-bfa8-730a19e96d9c" containerName="mariadb-account-create-update" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.866013 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d71698-4dc2-448a-9330-23372e2d508b" containerName="mariadb-account-create-update" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.866021 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="dnsmasq-dns" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.866030 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b1c5f0-e7fb-44b7-8c75-c8036f371c56" containerName="mariadb-database-create" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.867053 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.885347 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"] Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.909046 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6kxnf"] Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.910198 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.912108 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.918216 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.918419 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.918540 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.918640 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29kfm" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.943751 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6kxnf"] Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974675 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 
17:37:49.974763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974782 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjtmk\" (UniqueName: \"kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974829 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 
17:37:49.974884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-888xb\" (UniqueName: \"kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974922 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:49 crc 
kubenswrapper[4795]: I0320 17:37:49.975006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.053306 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-777644b489-7th7n"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.063268 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.073235 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.073486 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mdfb4" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.073604 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.073717 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076536 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-888xb\" (UniqueName: 
\"kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076643 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076747 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076764 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076785 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjtmk\" (UniqueName: \"kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config\") pod 
\"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.077615 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.078122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.088412 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-777644b489-7th7n"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.091515 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.091953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.092104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.097983 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.098222 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7mx5b"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.098506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.099480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.103262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.107401 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.110251 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.113381 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qbkvb" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.113703 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.113868 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.126865 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7mx5b"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.132320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjtmk\" (UniqueName: \"kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.133937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-888xb\" (UniqueName: \"kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179610 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-868zx\" (UniqueName: \"kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179722 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179764 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179860 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmwrk\" (UniqueName: \"kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.188067 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.188626 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.205496 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-rdxps"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.207153 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.208880 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.217268 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-nfr5n"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.218473 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.223734 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m5c4m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.223893 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.224098 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.224276 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qdq8q" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.226863 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.246640 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.247950 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rdxps"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.280460 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nfr5n"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.281363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-868zx\" (UniqueName: \"kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.282350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289393 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxdsg\" (UniqueName: \"kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289508 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289593 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289704 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289819 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289912 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290212 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290299 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290414 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290514 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290724 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4285l\" (UniqueName: \"kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290974 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmwrk\" (UniqueName: \"kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.291058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.293094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.296130 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.297881 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.298008 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " 
pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.299010 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.306023 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.311385 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-868zx\" (UniqueName: \"kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.330550 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.337366 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmwrk\" (UniqueName: \"kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.347228 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-4t68k"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.348191 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.353330 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.353557 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qhmpx" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393675 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j89f\" (UniqueName: \"kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4285l\" (UniqueName: \"kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 
17:37:50.393966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393998 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394032 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394068 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxdsg\" (UniqueName: \"kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394153 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394220 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394244 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394368 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.397552 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.399141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.405428 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4t68k"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.406914 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.408249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.418785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.419392 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.425899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.426340 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " 
pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.433430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.433665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4285l\" (UniqueName: \"kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.451955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.459297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxdsg\" (UniqueName: \"kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496082 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j89f\" (UniqueName: \"kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f\") pod 
\"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496188 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496223 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496294 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-692nl\" (UniqueName: \"kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl\") pod 
\"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496373 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.497364 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.497593 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.498144 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.499578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.499644 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.520455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j89f\" (UniqueName: \"kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.520524 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.521052 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.546021 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.552323 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.595628 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.598200 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.602874 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.603125 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdk4k\" (UniqueName: \"kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604443 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604463 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs\") pod \"horizon-6f84bdc6f9-rj454\" (UID: 
\"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604512 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604535 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-692nl\" (UniqueName: \"kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604578 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604603 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts\") pod \"horizon-6f84bdc6f9-rj454\" (UID: 
\"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.609554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.610245 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.613318 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.627941 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.629871 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.631516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-692nl\" (UniqueName: \"kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.632058 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s6lrt" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.632225 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.632466 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.632631 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.656134 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.656436 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.671024 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.697181 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdk4k\" (UniqueName: \"kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706766 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706807 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706840 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706899 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6mx5\" (UniqueName: \"kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706945 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707027 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz44h\" (UniqueName: \"kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707061 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707093 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707983 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.709039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.709292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.718148 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.719318 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-4t68k" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.743621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdk4k\" (UniqueName: \"kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz44h\" (UniqueName: \"kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808677 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808751 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808766 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs\") pod \"glance-default-external-api-0\" 
(UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808853 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808875 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808893 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6mx5\" (UniqueName: \"kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808922 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " 
pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.811300 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.820624 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.820895 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.822312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.822601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.822660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.825991 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.843458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6mx5\" (UniqueName: \"kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.846199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.848852 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.855456 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " 
pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.855667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.856320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.856862 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz44h\" (UniqueName: \"kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.857089 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.872600 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.867667 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-6kxnf"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.878761 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.909432 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.921087 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.963342 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.125237 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7mx5b"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.209600 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.214202 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.217554 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.217759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.229646 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-777644b489-7th7n"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.251110 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:37:51 crc kubenswrapper[4795]: W0320 17:37:51.297218 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a1074ea_5432_46f8_ba74_7c68912c68b6.slice/crio-715dd3d63dfa37a7a756a39bf44ed5e42b3e66055f9f285906dfbd2b63c913d6 WatchSource:0}: Error finding container 715dd3d63dfa37a7a756a39bf44ed5e42b3e66055f9f285906dfbd2b63c913d6: Status 404 returned error can't find the container with id 715dd3d63dfa37a7a756a39bf44ed5e42b3e66055f9f285906dfbd2b63c913d6 Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320491 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320579 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320600 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42fh2\" (UniqueName: 
\"kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.321011 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422468 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422883 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422970 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42fh2\" (UniqueName: \"kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422967 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.423030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.423050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" 
(UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.423772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.424046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.430993 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.431199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.432164 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc 
kubenswrapper[4795]: I0320 17:37:51.435055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.446444 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42fh2\" (UniqueName: \"kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.519671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.533296 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.706079 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rdxps"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.714134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mx5b" event={"ID":"37537245-d57e-4087-ade6-6c028eb4d137","Type":"ContainerStarted","Data":"43011a486c98482642b4a5dbe9079dc55e5de2d50808977b7d9c6649a885404a"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.714179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mx5b" event={"ID":"37537245-d57e-4087-ade6-6c028eb4d137","Type":"ContainerStarted","Data":"c771cb8aaa4f06cd374656dadf93993f0cd60baafd23685f3873cdc0a25a81b2"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.721610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6kxnf" event={"ID":"e756aad3-09ee-4c1c-b495-7417339f50e5","Type":"ContainerStarted","Data":"788d4e8fa3d02d5dbc18ca88de94a81c7f1ea8b686ca6f20236e85aa9366458c"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.721664 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6kxnf" event={"ID":"e756aad3-09ee-4c1c-b495-7417339f50e5","Type":"ContainerStarted","Data":"6a73d63fdc0ea281653f757d79fef4c1bc3b9c3ec1ea1387b57d51713a411b61"} Mar 20 17:37:51 crc kubenswrapper[4795]: W0320 17:37:51.736112 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706c47a0_7763_44af_9b14_0e5322a8f2f1.slice/crio-d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5 WatchSource:0}: Error finding container d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5: Status 404 returned error can't find the container with id 
d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5 Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.736613 4795 generic.go:334] "Generic (PLEG): container finished" podID="48116dbd-882f-4c5e-a8fe-4bea9195e73b" containerID="2241caa74db1d0488eb5dd2d754e106b2fc62040b64e0030ffbe349d4b865937" exitCode=0 Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.736667 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" event={"ID":"48116dbd-882f-4c5e-a8fe-4bea9195e73b","Type":"ContainerDied","Data":"2241caa74db1d0488eb5dd2d754e106b2fc62040b64e0030ffbe349d4b865937"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.736705 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" event={"ID":"48116dbd-882f-4c5e-a8fe-4bea9195e73b","Type":"ContainerStarted","Data":"bf06f2e3ef91158153acc1706941dba305656126f07ac38a5867119c1c7f7e8b"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.741287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerStarted","Data":"715dd3d63dfa37a7a756a39bf44ed5e42b3e66055f9f285906dfbd2b63c913d6"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.755232 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4t68k"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.765824 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nfr5n"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.767385 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7mx5b" podStartSLOduration=1.767358215 podStartE2EDuration="1.767358215s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 17:37:51.741334384 +0000 UTC m=+1215.199365915" watchObservedRunningTime="2026-03-20 17:37:51.767358215 +0000 UTC m=+1215.225389756" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.789393 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"] Mar 20 17:37:51 crc kubenswrapper[4795]: W0320 17:37:51.811871 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf79c1ee6_f8b4_485c_ac9e_667a09868206.slice/crio-f8b83339bc587b6ce7e9840f042549b74db3594a545a74800d72e3d558d164b1 WatchSource:0}: Error finding container f8b83339bc587b6ce7e9840f042549b74db3594a545a74800d72e3d558d164b1: Status 404 returned error can't find the container with id f8b83339bc587b6ce7e9840f042549b74db3594a545a74800d72e3d558d164b1 Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.827596 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.851667 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6kxnf" podStartSLOduration=2.851625387 podStartE2EDuration="2.851625387s" podCreationTimestamp="2026-03-20 17:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:51.772104842 +0000 UTC m=+1215.230136393" watchObservedRunningTime="2026-03-20 17:37:51.851625387 +0000 UTC m=+1215.309656948" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.928153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.052053 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.078876 4795 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.107917 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.156519 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.168142 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.169797 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.223398 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.265873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.266214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.266303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts\") pod \"horizon-fcd89d897-nsn69\" (UID: 
\"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.266429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwnmk\" (UniqueName: \"kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.266454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.267191 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.370458 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.370534 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.370580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.370649 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwnmk\" (UniqueName: \"kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.370668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.371040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.373099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.374423 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " 
pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.388430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.416058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwnmk\" (UniqueName: \"kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.419302 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.520093 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.541967 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581163 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjtmk\" (UniqueName: \"kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581346 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.590614 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk" (OuterVolumeSpecName: "kube-api-access-qjtmk") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "kube-api-access-qjtmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.608876 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config" (OuterVolumeSpecName: "config") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.609704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.610497 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.616076 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.622275 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.689657 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.689937 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.689947 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.689959 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc 
kubenswrapper[4795]: I0320 17:37:52.689968 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjtmk\" (UniqueName: \"kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.689975 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.755506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerStarted","Data":"d2d85da431f6c738cb4aa9ee890ff4d70deedc277e0a8410951bda0e019d69a8"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.757248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f84bdc6f9-rj454" event={"ID":"c2e1702a-166c-4c2e-9c39-d32a62528a89","Type":"ContainerStarted","Data":"49b0337f706e9262134a18e9a4347becf3d1ef94349c1c8c94395df30d906dd2"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.760052 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4t68k" event={"ID":"d254abd5-b344-416a-b99d-96737388795e","Type":"ContainerStarted","Data":"eaa9eee2e882516c5d4ae5df7684d52bf42c7eec92e061674b1b8ad393538f60"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.764054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" event={"ID":"48116dbd-882f-4c5e-a8fe-4bea9195e73b","Type":"ContainerDied","Data":"bf06f2e3ef91158153acc1706941dba305656126f07ac38a5867119c1c7f7e8b"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.764155 4795 scope.go:117] "RemoveContainer" containerID="2241caa74db1d0488eb5dd2d754e106b2fc62040b64e0030ffbe349d4b865937" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 
17:37:52.764288 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.781308 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rdxps" event={"ID":"706c47a0-7763-44af-9b14-0e5322a8f2f1","Type":"ContainerStarted","Data":"d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.796032 4795 generic.go:334] "Generic (PLEG): container finished" podID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerID="65673e010192e9af6a054b2e6fafb5d1f1505b377d27e64bdbfd06c2c8d1a1c2" exitCode=0 Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.796185 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" event={"ID":"f79c1ee6-f8b4-485c-ac9e-667a09868206","Type":"ContainerDied","Data":"65673e010192e9af6a054b2e6fafb5d1f1505b377d27e64bdbfd06c2c8d1a1c2"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.796212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" event={"ID":"f79c1ee6-f8b4-485c-ac9e-667a09868206","Type":"ContainerStarted","Data":"f8b83339bc587b6ce7e9840f042549b74db3594a545a74800d72e3d558d164b1"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.800113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerStarted","Data":"5feb2d488b80d85ec78d6982b03a6895822f924d5e7cd9f2b1dd64d4c4e88e67"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.807440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfr5n" event={"ID":"78238b29-6bdd-4f77-847e-731c6c785ed9","Type":"ContainerStarted","Data":"88a328785f37d83e7d6391b28b27d2d2c6fdbb3c3985b1505819228d225ea6fa"} Mar 20 17:37:52 crc kubenswrapper[4795]: 
I0320 17:37:52.813151 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerStarted","Data":"a9bb6c0d2645cf89398ba5e524f48e4c91c8aba913f48a27e0f35d07ec1b7929"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.852661 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.861061 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"] Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.172061 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.274955 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48116dbd-882f-4c5e-a8fe-4bea9195e73b" path="/var/lib/kubelet/pods/48116dbd-882f-4c5e-a8fe-4bea9195e73b/volumes" Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.825440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerStarted","Data":"02a106435121a29bb7e883006bb45d54dcf75dcebd8e8d213a1788cfe4f4db42"} Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.828256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" event={"ID":"f79c1ee6-f8b4-485c-ac9e-667a09868206","Type":"ContainerStarted","Data":"fb25591a64e281622f2e5d8c32301267612a9141ea5e32d635d846faf3ba4c18"} Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.829227 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.830959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerStarted","Data":"8156deaf77b9791c001c9273817906ecaad8bb0c5b81a96342d9d6c47a110f46"} Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.849796 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" podStartSLOduration=3.849772281 podStartE2EDuration="3.849772281s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:53.842862726 +0000 UTC m=+1217.300894257" watchObservedRunningTime="2026-03-20 17:37:53.849772281 +0000 UTC m=+1217.307803822" Mar 20 17:37:54 crc kubenswrapper[4795]: I0320 17:37:54.863964 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerStarted","Data":"17b7a209ce67945452fb27836d24f12b589db2a002bdd4918bcb9350ffa18491"} Mar 20 17:37:54 crc kubenswrapper[4795]: I0320 17:37:54.916026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerStarted","Data":"806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc"} Mar 20 17:37:54 crc kubenswrapper[4795]: I0320 17:37:54.916804 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-log" containerID="cri-o://8156deaf77b9791c001c9273817906ecaad8bb0c5b81a96342d9d6c47a110f46" gracePeriod=30 Mar 20 17:37:54 crc kubenswrapper[4795]: I0320 17:37:54.917206 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-httpd" 
containerID="cri-o://806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc" gracePeriod=30 Mar 20 17:37:54 crc kubenswrapper[4795]: I0320 17:37:54.976905 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.976880158 podStartE2EDuration="4.976880158s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:54.963981517 +0000 UTC m=+1218.422013078" watchObservedRunningTime="2026-03-20 17:37:54.976880158 +0000 UTC m=+1218.434911699" Mar 20 17:37:55 crc kubenswrapper[4795]: E0320 17:37:55.040737 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6db155e7_c2bd_430d_b59f_895fce359c51.slice/crio-806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6db155e7_c2bd_430d_b59f_895fce359c51.slice/crio-conmon-806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.971199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerStarted","Data":"9d38db2257f8dc5a90f78951ae37b98e9efaa5b86ae171e097328140d65c32d7"} Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.971557 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-log" containerID="cri-o://17b7a209ce67945452fb27836d24f12b589db2a002bdd4918bcb9350ffa18491" 
gracePeriod=30 Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.972012 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-httpd" containerID="cri-o://9d38db2257f8dc5a90f78951ae37b98e9efaa5b86ae171e097328140d65c32d7" gracePeriod=30 Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.997735 4795 generic.go:334] "Generic (PLEG): container finished" podID="6db155e7-c2bd-430d-b59f-895fce359c51" containerID="806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc" exitCode=143 Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.998085 4795 generic.go:334] "Generic (PLEG): container finished" podID="6db155e7-c2bd-430d-b59f-895fce359c51" containerID="8156deaf77b9791c001c9273817906ecaad8bb0c5b81a96342d9d6c47a110f46" exitCode=143 Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.997819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerDied","Data":"806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc"} Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.998148 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerDied","Data":"8156deaf77b9791c001c9273817906ecaad8bb0c5b81a96342d9d6c47a110f46"} Mar 20 17:37:56 crc kubenswrapper[4795]: I0320 17:37:56.038901 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.038878439 podStartE2EDuration="6.038878439s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:56.011998942 +0000 UTC 
m=+1219.470030493" watchObservedRunningTime="2026-03-20 17:37:56.038878439 +0000 UTC m=+1219.496909980" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.011805 4795 generic.go:334] "Generic (PLEG): container finished" podID="e756aad3-09ee-4c1c-b495-7417339f50e5" containerID="788d4e8fa3d02d5dbc18ca88de94a81c7f1ea8b686ca6f20236e85aa9366458c" exitCode=0 Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.011906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6kxnf" event={"ID":"e756aad3-09ee-4c1c-b495-7417339f50e5","Type":"ContainerDied","Data":"788d4e8fa3d02d5dbc18ca88de94a81c7f1ea8b686ca6f20236e85aa9366458c"} Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.015948 4795 generic.go:334] "Generic (PLEG): container finished" podID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerID="9d38db2257f8dc5a90f78951ae37b98e9efaa5b86ae171e097328140d65c32d7" exitCode=0 Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.015980 4795 generic.go:334] "Generic (PLEG): container finished" podID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerID="17b7a209ce67945452fb27836d24f12b589db2a002bdd4918bcb9350ffa18491" exitCode=143 Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.016007 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerDied","Data":"9d38db2257f8dc5a90f78951ae37b98e9efaa5b86ae171e097328140d65c32d7"} Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.016036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerDied","Data":"17b7a209ce67945452fb27836d24f12b589db2a002bdd4918bcb9350ffa18491"} Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.601664 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704239 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704275 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704423 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6mx5\" (UniqueName: \"kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704513 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704544 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704560 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704937 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704954 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs" (OuterVolumeSpecName: "logs") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.712183 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.712863 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5" (OuterVolumeSpecName: "kube-api-access-l6mx5") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "kube-api-access-l6mx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.725331 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts" (OuterVolumeSpecName: "scripts") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.745974 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.760892 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.772865 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data" (OuterVolumeSpecName: "config-data") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805882 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805910 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6mx5\" (UniqueName: \"kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805920 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805929 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:57 crc 
kubenswrapper[4795]: I0320 17:37:57.805937 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805970 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805978 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805986 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.821938 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.907210 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.029092 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerDied","Data":"a9bb6c0d2645cf89398ba5e524f48e4c91c8aba913f48a27e0f35d07ec1b7929"} Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.029103 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.029353 4795 scope.go:117] "RemoveContainer" containerID="806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.068490 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.079171 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.091138 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:58 crc kubenswrapper[4795]: E0320 17:37:58.093635 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-httpd" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.093701 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-httpd" Mar 20 17:37:58 crc kubenswrapper[4795]: E0320 17:37:58.093725 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48116dbd-882f-4c5e-a8fe-4bea9195e73b" containerName="init" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.093736 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="48116dbd-882f-4c5e-a8fe-4bea9195e73b" containerName="init" Mar 20 17:37:58 crc kubenswrapper[4795]: E0320 17:37:58.093771 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-log" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.093779 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-log" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.093985 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="48116dbd-882f-4c5e-a8fe-4bea9195e73b" containerName="init" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.094012 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-httpd" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.094026 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-log" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.095102 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.097427 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.098208 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.107264 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.211902 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.211955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" 
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212039 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212094 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212173 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5q4r\" (UniqueName: \"kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " 
pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212217 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc 
kubenswrapper[4795]: I0320 17:37:58.316354 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5q4r\" (UniqueName: \"kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316486 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.318446 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.319250 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.327196 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.333439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.333844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.334891 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.335920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.355365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5q4r\" (UniqueName: \"kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.404908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.439146 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-777644b489-7th7n"] Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.452718 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"] Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.453993 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.457707 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.468359 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"] Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.516309 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520098 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6ghq\" (UniqueName: \"kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520169 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520263 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.549460 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7fb74ddb8-dbrvh"] Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.560134 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb74ddb8-dbrvh"] Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.560235 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.567738 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.568347 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621475 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6ghq\" (UniqueName: \"kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621597 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621623 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.622708 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.623390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.623854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.625698 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.629776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.640285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.647327 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6ghq\" (UniqueName: \"kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723248 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-scripts\") pod 
\"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723311 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-config-data\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-logs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-tls-certs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723503 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzvp\" (UniqueName: \"kubernetes.io/projected/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-kube-api-access-lvzvp\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-combined-ca-bundle\") pod 
\"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-secret-key\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.776219 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.824871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-tls-certs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825008 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzvp\" (UniqueName: \"kubernetes.io/projected/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-kube-api-access-lvzvp\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-combined-ca-bundle\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825084 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-secret-key\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-scripts\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-config-data\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-logs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825788 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-logs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.827338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-scripts\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: 
\"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.828153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-config-data\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.830378 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-combined-ca-bundle\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.832334 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-tls-certs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.832827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-secret-key\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.842342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzvp\" (UniqueName: \"kubernetes.io/projected/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-kube-api-access-lvzvp\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:58 crc 
kubenswrapper[4795]: I0320 17:37:58.902229 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:37:59 crc kubenswrapper[4795]: I0320 17:37:59.264265 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" path="/var/lib/kubelet/pods/6db155e7-c2bd-430d-b59f-895fce359c51/volumes" Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.134459 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567138-7flct"] Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.135644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-7flct" Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.138018 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.138386 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.139556 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.165761 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-7flct"] Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.267126 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff42s\" (UniqueName: \"kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s\") pod \"auto-csr-approver-29567138-7flct\" (UID: \"e83d2a1a-2b3b-409a-997a-672e322b1d8e\") " pod="openshift-infra/auto-csr-approver-29567138-7flct" Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.369470 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff42s\" (UniqueName: \"kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s\") pod \"auto-csr-approver-29567138-7flct\" (UID: \"e83d2a1a-2b3b-409a-997a-672e322b1d8e\") " pod="openshift-infra/auto-csr-approver-29567138-7flct" Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.405373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff42s\" (UniqueName: \"kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s\") pod \"auto-csr-approver-29567138-7flct\" (UID: \"e83d2a1a-2b3b-409a-997a-672e322b1d8e\") " pod="openshift-infra/auto-csr-approver-29567138-7flct" Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.472119 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-7flct" Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.698926 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.749224 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"] Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.749665 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" containerID="cri-o://46bc17154a9a8ac331153a8f5a685f3f770ff87ff542f8ade4f4cba000f1d481" gracePeriod=10 Mar 20 17:38:02 crc kubenswrapper[4795]: I0320 17:38:02.064123 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed82521a-8a78-4611-870f-5ad53625bddf" containerID="46bc17154a9a8ac331153a8f5a685f3f770ff87ff542f8ade4f4cba000f1d481" exitCode=0 Mar 20 17:38:02 crc kubenswrapper[4795]: I0320 17:38:02.064171 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" event={"ID":"ed82521a-8a78-4611-870f-5ad53625bddf","Type":"ContainerDied","Data":"46bc17154a9a8ac331153a8f5a685f3f770ff87ff542f8ade4f4cba000f1d481"} Mar 20 17:38:04 crc kubenswrapper[4795]: I0320 17:38:04.028490 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Mar 20 17:38:07 crc kubenswrapper[4795]: E0320 17:38:07.623466 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 17:38:07 crc kubenswrapper[4795]: E0320 17:38:07.626094 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n566h7ch589h555h5fbhd7h97h587h55bh658hc4h656h5d6h656h5ddh669h95h5c6h5b7h598h5d6hd8h59dh646hfbhd6hb9h5d6h8fh5dch5cfh64q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdk4k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6f84bdc6f9-rj454_openstack(c2e1702a-166c-4c2e-9c39-d32a62528a89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:38:07 crc kubenswrapper[4795]: E0320 
17:38:07.629649 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6f84bdc6f9-rj454" podUID="c2e1702a-166c-4c2e-9c39-d32a62528a89" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.725164 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.731156 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922446 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922830 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922886 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922958 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-888xb\" (UniqueName: \"kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923005 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42fh2\" (UniqueName: \"kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: 
\"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923126 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923145 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923163 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923195 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923241 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs" (OuterVolumeSpecName: "logs") pod 
"67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923515 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.924179 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.929758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.929762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb" (OuterVolumeSpecName: "kube-api-access-888xb") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "kube-api-access-888xb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.929814 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.931469 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts" (OuterVolumeSpecName: "scripts") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.931568 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.946742 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts" (OuterVolumeSpecName: "scripts") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.949398 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.951468 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data" (OuterVolumeSpecName: "config-data") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.957232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2" (OuterVolumeSpecName: "kube-api-access-42fh2") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "kube-api-access-42fh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.961632 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.983837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.987679 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data" (OuterVolumeSpecName: "config-data") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025619 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025665 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025677 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025706 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc 
kubenswrapper[4795]: I0320 17:38:08.025716 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025724 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025732 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025741 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025749 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-888xb\" (UniqueName: \"kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025758 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025781 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42fh2\" (UniqueName: \"kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025789 4795 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025821 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.042009 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.121650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6kxnf" event={"ID":"e756aad3-09ee-4c1c-b495-7417339f50e5","Type":"ContainerDied","Data":"6a73d63fdc0ea281653f757d79fef4c1bc3b9c3ec1ea1387b57d51713a411b61"} Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.121729 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.121734 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a73d63fdc0ea281653f757d79fef4c1bc3b9c3ec1ea1387b57d51713a411b61" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.124649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerDied","Data":"5feb2d488b80d85ec78d6982b03a6895822f924d5e7cd9f2b1dd64d4c4e88e67"} Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.124814 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.127240 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.177825 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.185487 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.201985 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:38:08 crc kubenswrapper[4795]: E0320 17:38:08.202293 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-httpd" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202305 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-httpd" Mar 20 17:38:08 crc kubenswrapper[4795]: E0320 17:38:08.202317 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e756aad3-09ee-4c1c-b495-7417339f50e5" containerName="keystone-bootstrap" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202323 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e756aad3-09ee-4c1c-b495-7417339f50e5" containerName="keystone-bootstrap" Mar 20 17:38:08 crc kubenswrapper[4795]: E0320 17:38:08.202335 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-log" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202341 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-log" Mar 20 
17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202492 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-log" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202509 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-httpd" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202519 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e756aad3-09ee-4c1c-b495-7417339f50e5" containerName="keystone-bootstrap" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.203302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.208412 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.208602 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.212010 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.329864 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.329917 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.329964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.329979 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.329997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.330021 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.330045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8cs8\" (UniqueName: \"kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8\") pod 
\"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.330145 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.431846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.431896 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.431937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.432427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run\") pod \"glance-default-internal-api-0\" 
(UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.432583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.432606 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.432944 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.432854 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.433289 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 
20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.433324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.433381 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8cs8\" (UniqueName: \"kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.436133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.436302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.436918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.438747 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.451051 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8cs8\" (UniqueName: \"kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.462637 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.523845 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.844328 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6kxnf"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.851285 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6kxnf"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.960819 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qfpzw"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.962050 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.964114 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.967193 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.967258 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29kfm" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.967362 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.967468 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.970547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qfpzw"] Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.146717 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2c2n\" (UniqueName: \"kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.146839 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.146866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.146969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.147031 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.147058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248320 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2c2n\" (UniqueName: \"kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248487 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.252696 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts\") pod \"keystone-bootstrap-qfpzw\" (UID: 
\"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.255003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.255033 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.255847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.257424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.266320 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" path="/var/lib/kubelet/pods/67dd868e-24f8-426f-b835-1e92ab4441e6/volumes" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.266957 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e756aad3-09ee-4c1c-b495-7417339f50e5" 
path="/var/lib/kubelet/pods/e756aad3-09ee-4c1c-b495-7417339f50e5/volumes" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.267469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2c2n\" (UniqueName: \"kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.285633 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:11 crc kubenswrapper[4795]: I0320 17:38:11.300244 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:38:11 crc kubenswrapper[4795]: I0320 17:38:11.300541 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:38:14 crc kubenswrapper[4795]: I0320 17:38:14.028927 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Mar 20 17:38:15 crc kubenswrapper[4795]: I0320 17:38:15.844682 4795 scope.go:117] "RemoveContainer" containerID="8156deaf77b9791c001c9273817906ecaad8bb0c5b81a96342d9d6c47a110f46" Mar 20 17:38:16 crc kubenswrapper[4795]: E0320 17:38:16.361404 4795 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 17:38:16 crc kubenswrapper[4795]: E0320 17:38:16.361946 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-692nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
barbican-db-sync-4t68k_openstack(d254abd5-b344-416a-b99d-96737388795e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:38:16 crc kubenswrapper[4795]: E0320 17:38:16.363136 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-4t68k" podUID="d254abd5-b344-416a-b99d-96737388795e" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.497254 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.504239 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.588824 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdk4k\" (UniqueName: \"kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k\") pod \"c2e1702a-166c-4c2e-9c39-d32a62528a89\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.588903 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.588932 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxcnf\" (UniqueName: \"kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 
crc kubenswrapper[4795]: I0320 17:38:16.588953 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.588988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data\") pod \"c2e1702a-166c-4c2e-9c39-d32a62528a89\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589014 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs\") pod \"c2e1702a-166c-4c2e-9c39-d32a62528a89\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589076 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts\") pod 
\"c2e1702a-166c-4c2e-9c39-d32a62528a89\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589170 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key\") pod \"c2e1702a-166c-4c2e-9c39-d32a62528a89\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.590364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data" (OuterVolumeSpecName: "config-data") pod "c2e1702a-166c-4c2e-9c39-d32a62528a89" (UID: "c2e1702a-166c-4c2e-9c39-d32a62528a89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.590860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts" (OuterVolumeSpecName: "scripts") pod "c2e1702a-166c-4c2e-9c39-d32a62528a89" (UID: "c2e1702a-166c-4c2e-9c39-d32a62528a89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.590989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs" (OuterVolumeSpecName: "logs") pod "c2e1702a-166c-4c2e-9c39-d32a62528a89" (UID: "c2e1702a-166c-4c2e-9c39-d32a62528a89"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.595132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c2e1702a-166c-4c2e-9c39-d32a62528a89" (UID: "c2e1702a-166c-4c2e-9c39-d32a62528a89"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.596087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k" (OuterVolumeSpecName: "kube-api-access-fdk4k") pod "c2e1702a-166c-4c2e-9c39-d32a62528a89" (UID: "c2e1702a-166c-4c2e-9c39-d32a62528a89"). InnerVolumeSpecName "kube-api-access-fdk4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.595991 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf" (OuterVolumeSpecName: "kube-api-access-xxcnf") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "kube-api-access-xxcnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.635928 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config" (OuterVolumeSpecName: "config") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.637486 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.640476 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.640634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.642494 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690451 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdk4k\" (UniqueName: \"kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690491 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690503 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxcnf\" (UniqueName: \"kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690514 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690526 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690538 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690549 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690561 4795 reconciler_common.go:293] 
"Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690572 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690583 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690593 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.201095 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" event={"ID":"ed82521a-8a78-4611-870f-5ad53625bddf","Type":"ContainerDied","Data":"4bbbefbb238ecd4d186023f5577494b794cb2bb3b7f5edef795d647b067a660b"} Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.201207 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.203829 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.208012 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f84bdc6f9-rj454" event={"ID":"c2e1702a-166c-4c2e-9c39-d32a62528a89","Type":"ContainerDied","Data":"49b0337f706e9262134a18e9a4347becf3d1ef94349c1c8c94395df30d906dd2"} Mar 20 17:38:17 crc kubenswrapper[4795]: E0320 17:38:17.209878 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-4t68k" podUID="d254abd5-b344-416a-b99d-96737388795e" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.288900 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"] Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.297143 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"] Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.305139 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"] Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.311985 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"] Mar 20 17:38:17 crc kubenswrapper[4795]: E0320 17:38:17.625564 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 17:38:17 crc kubenswrapper[4795]: E0320 17:38:17.625994 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4285l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rdxps_openstack(706c47a0-7763-44af-9b14-0e5322a8f2f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:38:17 crc kubenswrapper[4795]: E0320 17:38:17.627449 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rdxps" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.630222 4795 scope.go:117] "RemoveContainer" containerID="9d38db2257f8dc5a90f78951ae37b98e9efaa5b86ae171e097328140d65c32d7" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.870952 4795 scope.go:117] "RemoveContainer" containerID="17b7a209ce67945452fb27836d24f12b589db2a002bdd4918bcb9350ffa18491" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.903851 4795 scope.go:117] "RemoveContainer" containerID="46bc17154a9a8ac331153a8f5a685f3f770ff87ff542f8ade4f4cba000f1d481" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.936209 4795 scope.go:117] "RemoveContainer" containerID="99144a28a69d3c6fb3d096bda447f3ff1d42028233b58205725db4c507c27464" Mar 20 17:38:18 crc kubenswrapper[4795]: W0320 17:38:18.165304 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3bb3cb2_7e6a_4c4d_9cb9_cd8d6683c109.slice/crio-768cfe68194d37fc422479537b1915f4748384dcf8df7aa52ba261d9397d31b5 WatchSource:0}: Error finding container 768cfe68194d37fc422479537b1915f4748384dcf8df7aa52ba261d9397d31b5: Status 404 returned error can't find the container with id 768cfe68194d37fc422479537b1915f4748384dcf8df7aa52ba261d9397d31b5 
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.171517 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb74ddb8-dbrvh"] Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.217131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerStarted","Data":"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012"} Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.217173 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerStarted","Data":"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686"} Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.217286 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-777644b489-7th7n" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon-log" containerID="cri-o://fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686" gracePeriod=30 Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.218810 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-777644b489-7th7n" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon" containerID="cri-o://8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012" gracePeriod=30 Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.232368 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerStarted","Data":"3b6b098ddb9cfeee495acedec7b7145d7d7ba5c2f18ba21ad1f2ac7b8c96c1b3"} Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.233963 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb74ddb8-dbrvh" 
event={"ID":"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109","Type":"ContainerStarted","Data":"768cfe68194d37fc422479537b1915f4748384dcf8df7aa52ba261d9397d31b5"} Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.240822 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfr5n" event={"ID":"78238b29-6bdd-4f77-847e-731c6c785ed9","Type":"ContainerStarted","Data":"8f1173f5ebccc23013501b8ad9c477f608df64a752414481775b8bf5160525e1"} Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.243875 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-777644b489-7th7n" podStartSLOduration=3.152344978 podStartE2EDuration="28.243858442s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="2026-03-20 17:37:51.301522343 +0000 UTC m=+1214.759553884" lastFinishedPulling="2026-03-20 17:38:16.393035807 +0000 UTC m=+1239.851067348" observedRunningTime="2026-03-20 17:38:18.23899012 +0000 UTC m=+1241.697021661" watchObservedRunningTime="2026-03-20 17:38:18.243858442 +0000 UTC m=+1241.701889983" Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.251517 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-fcd89d897-nsn69" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon-log" containerID="cri-o://71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" gracePeriod=30 Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.251951 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerStarted","Data":"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d"} Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.252072 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" 
event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerStarted","Data":"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449"} Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.253068 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-fcd89d897-nsn69" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon" containerID="cri-o://109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" gracePeriod=30 Mar 20 17:38:18 crc kubenswrapper[4795]: E0320 17:38:18.258364 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-rdxps" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1" Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.290007 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-nfr5n" podStartSLOduration=3.638139245 podStartE2EDuration="28.289990702s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="2026-03-20 17:37:51.741213771 +0000 UTC m=+1215.199245312" lastFinishedPulling="2026-03-20 17:38:16.393065208 +0000 UTC m=+1239.851096769" observedRunningTime="2026-03-20 17:38:18.258021977 +0000 UTC m=+1241.716053508" watchObservedRunningTime="2026-03-20 17:38:18.289990702 +0000 UTC m=+1241.748022243" Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.298488 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-7flct"] Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.318803 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qfpzw"] Mar 20 17:38:18 crc kubenswrapper[4795]: W0320 17:38:18.318821 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4244f6d6_536a_4555_a05b_176d696d427d.slice/crio-f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6 WatchSource:0}: Error finding container f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6: Status 404 returned error can't find the container with id f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6 Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.319261 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-fcd89d897-nsn69" podStartSLOduration=1.812779688 podStartE2EDuration="26.319252191s" podCreationTimestamp="2026-03-20 17:37:52 +0000 UTC" firstStartedPulling="2026-03-20 17:37:53.184941894 +0000 UTC m=+1216.642973435" lastFinishedPulling="2026-03-20 17:38:17.691414377 +0000 UTC m=+1241.149445938" observedRunningTime="2026-03-20 17:38:18.30810472 +0000 UTC m=+1241.766136261" watchObservedRunningTime="2026-03-20 17:38:18.319252191 +0000 UTC m=+1241.777283732" Mar 20 17:38:18 crc kubenswrapper[4795]: W0320 17:38:18.321812 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3e822b2_0b57_4f89_ab29_caeb483457a1.slice/crio-b4a6e3e35ee28437bb36524dd75862315999254a91ff6cc4192d379a2a0e45e4 WatchSource:0}: Error finding container b4a6e3e35ee28437bb36524dd75862315999254a91ff6cc4192d379a2a0e45e4: Status 404 returned error can't find the container with id b4a6e3e35ee28437bb36524dd75862315999254a91ff6cc4192d379a2a0e45e4 Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.335180 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"] Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.494708 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:38:18 crc kubenswrapper[4795]: W0320 17:38:18.501708 4795 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e24d4d3_23ba_4ab0_a5af_3a6dfc19c197.slice/crio-763133b846bce72cdf94f169388305eb03a69632768733042144310abb80652c WatchSource:0}: Error finding container 763133b846bce72cdf94f169388305eb03a69632768733042144310abb80652c: Status 404 returned error can't find the container with id 763133b846bce72cdf94f169388305eb03a69632768733042144310abb80652c Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.033612 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.242144 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.266160 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e1702a-166c-4c2e-9c39-d32a62528a89" path="/var/lib/kubelet/pods/c2e1702a-166c-4c2e-9c39-d32a62528a89/volumes" Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.267155 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" path="/var/lib/kubelet/pods/ed82521a-8a78-4611-870f-5ad53625bddf/volumes" Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.271873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerStarted","Data":"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006"} Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.271921 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerStarted","Data":"763133b846bce72cdf94f169388305eb03a69632768733042144310abb80652c"} Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.305012 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb74ddb8-dbrvh" event={"ID":"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109","Type":"ContainerStarted","Data":"d39fafb68e4cc0a60f4946e801afe6e5a5c3ec76ae4ec0eb5f1dba6501b35e42"} Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.305068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb74ddb8-dbrvh" event={"ID":"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109","Type":"ContainerStarted","Data":"f4bf1c555c928a4c546cc74f9c927811c7ebd6235271eebb506ed1f67527a50b"} Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.330387 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7fb74ddb8-dbrvh" podStartSLOduration=21.330370076 podStartE2EDuration="21.330370076s" podCreationTimestamp="2026-03-20 17:37:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:19.324039897 +0000 UTC m=+1242.782071438" watchObservedRunningTime="2026-03-20 17:38:19.330370076 +0000 UTC m=+1242.788401607" Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.346370 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfpzw" event={"ID":"4244f6d6-536a-4555-a05b-176d696d427d","Type":"ContainerStarted","Data":"8e4952423fe886bac972193165a7d0b5d846db9f137b7cbf7c828182ef389d13"} Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.346410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfpzw" event={"ID":"4244f6d6-536a-4555-a05b-176d696d427d","Type":"ContainerStarted","Data":"f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6"} Mar 20 17:38:19 crc 
kubenswrapper[4795]: I0320 17:38:19.366965 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerStarted","Data":"5be00c0e636ec09ccd42a36c542755b2d984e3e3c6dddd06a91f3eb8b8a7efdb"} Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.367012 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerStarted","Data":"da11e766148fb6f38d02c50468b495d9c10ec9fe653ddad3b144b8edd961b2d3"} Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.367022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerStarted","Data":"b4a6e3e35ee28437bb36524dd75862315999254a91ff6cc4192d379a2a0e45e4"} Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.374474 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-7flct" event={"ID":"e83d2a1a-2b3b-409a-997a-672e322b1d8e","Type":"ContainerStarted","Data":"d0709eba067851036b3ccf0f38eb78dfb3069d88982f3378db1881ed2de27d68"} Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.377751 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qfpzw" podStartSLOduration=11.377726744 podStartE2EDuration="11.377726744s" podCreationTimestamp="2026-03-20 17:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:19.360804402 +0000 UTC m=+1242.818835963" watchObservedRunningTime="2026-03-20 17:38:19.377726744 +0000 UTC m=+1242.835758285" Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.390939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-698b6ff5c8-7p5rs" 
podStartSLOduration=21.390920528 podStartE2EDuration="21.390920528s" podCreationTimestamp="2026-03-20 17:37:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:19.38845065 +0000 UTC m=+1242.846482221" watchObservedRunningTime="2026-03-20 17:38:19.390920528 +0000 UTC m=+1242.848952089" Mar 20 17:38:20 crc kubenswrapper[4795]: I0320 17:38:20.406164 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerStarted","Data":"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619"} Mar 20 17:38:20 crc kubenswrapper[4795]: I0320 17:38:20.411256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerStarted","Data":"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"} Mar 20 17:38:20 crc kubenswrapper[4795]: I0320 17:38:20.411345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerStarted","Data":"0cc2eb8ef99525fb1f871b8b8fb5220a96df925d928238b2cb6ec98c0cfc670e"} Mar 20 17:38:20 crc kubenswrapper[4795]: I0320 17:38:20.442614 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.442592768 podStartE2EDuration="12.442592768s" podCreationTimestamp="2026-03-20 17:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:20.435196035 +0000 UTC m=+1243.893227596" watchObservedRunningTime="2026-03-20 17:38:20.442592768 +0000 UTC m=+1243.900624319" Mar 20 17:38:20 crc kubenswrapper[4795]: I0320 17:38:20.521482 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-777644b489-7th7n" Mar 20 17:38:22 crc kubenswrapper[4795]: I0320 17:38:22.521424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.480306 4795 generic.go:334] "Generic (PLEG): container finished" podID="e83d2a1a-2b3b-409a-997a-672e322b1d8e" containerID="27cb2cc4ca0cf03af5e4f56a72a8901b4a28c70c5abb54e1f86d55c8053dcc74" exitCode=0 Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.480362 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-7flct" event={"ID":"e83d2a1a-2b3b-409a-997a-672e322b1d8e","Type":"ContainerDied","Data":"27cb2cc4ca0cf03af5e4f56a72a8901b4a28c70c5abb54e1f86d55c8053dcc74"} Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.483284 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerStarted","Data":"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"} Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.483365 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-log" containerID="cri-o://ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6" gracePeriod=30 Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.483382 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-httpd" containerID="cri-o://f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f" gracePeriod=30 Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.487681 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerStarted","Data":"5d540cc1bf447dcf67570fd849a590667f72184381546b2b04f9eddfb973cf69"} Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.497076 4795 generic.go:334] "Generic (PLEG): container finished" podID="78238b29-6bdd-4f77-847e-731c6c785ed9" containerID="8f1173f5ebccc23013501b8ad9c477f608df64a752414481775b8bf5160525e1" exitCode=0 Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.497156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfr5n" event={"ID":"78238b29-6bdd-4f77-847e-731c6c785ed9","Type":"ContainerDied","Data":"8f1173f5ebccc23013501b8ad9c477f608df64a752414481775b8bf5160525e1"} Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.501976 4795 generic.go:334] "Generic (PLEG): container finished" podID="4244f6d6-536a-4555-a05b-176d696d427d" containerID="8e4952423fe886bac972193165a7d0b5d846db9f137b7cbf7c828182ef389d13" exitCode=0 Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.502021 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfpzw" event={"ID":"4244f6d6-536a-4555-a05b-176d696d427d","Type":"ContainerDied","Data":"8e4952423fe886bac972193165a7d0b5d846db9f137b7cbf7c828182ef389d13"} Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.515533 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=25.515515166 podStartE2EDuration="25.515515166s" podCreationTimestamp="2026-03-20 17:37:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:23.510163218 +0000 UTC m=+1246.968194759" watchObservedRunningTime="2026-03-20 17:38:23.515515166 +0000 UTC m=+1246.973546707" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.202151 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.232756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.232841 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.232938 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233005 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233068 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233150 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5q4r\" (UniqueName: \"kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.234232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs" (OuterVolumeSpecName: "logs") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.234629 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.234679 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.244086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts" (OuterVolumeSpecName: "scripts") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.246891 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r" (OuterVolumeSpecName: "kube-api-access-b5q4r") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "kube-api-access-b5q4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.267932 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.292979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.296397 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.301802 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data" (OuterVolumeSpecName: "config-data") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337364 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337420 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337439 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5q4r\" (UniqueName: \"kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337454 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337467 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337479 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.365633 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.439803 4795 reconciler_common.go:293] "Volume detached for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512531 4795 generic.go:334] "Generic (PLEG): container finished" podID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerID="f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f" exitCode=0 Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512563 4795 generic.go:334] "Generic (PLEG): container finished" podID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerID="ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6" exitCode=143 Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerDied","Data":"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"} Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512665 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512698 4795 scope.go:117] "RemoveContainer" containerID="f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512671 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerDied","Data":"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"} Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerDied","Data":"0cc2eb8ef99525fb1f871b8b8fb5220a96df925d928238b2cb6ec98c0cfc670e"} Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.516487 4795 generic.go:334] "Generic (PLEG): container finished" podID="37537245-d57e-4087-ade6-6c028eb4d137" containerID="43011a486c98482642b4a5dbe9079dc55e5de2d50808977b7d9c6649a885404a" exitCode=0 Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.516713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mx5b" event={"ID":"37537245-d57e-4087-ade6-6c028eb4d137","Type":"ContainerDied","Data":"43011a486c98482642b4a5dbe9079dc55e5de2d50808977b7d9c6649a885404a"} Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.546839 4795 scope.go:117] "RemoveContainer" containerID="ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.583691 4795 scope.go:117] "RemoveContainer" containerID="f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f" Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.587854 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f\": container with ID starting with f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f not found: ID does not exist" containerID="f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.587902 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"} err="failed to get container status \"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f\": rpc error: code = NotFound desc = could not find container \"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f\": container with ID starting with f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f not found: ID does not exist" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.587927 4795 scope.go:117] "RemoveContainer" containerID="ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6" Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.589553 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6\": container with ID starting with ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6 not found: ID does not exist" containerID="ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.589618 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"} err="failed to get container status \"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6\": rpc error: code = NotFound desc = could not find container \"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6\": container with ID 
starting with ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6 not found: ID does not exist" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.589645 4795 scope.go:117] "RemoveContainer" containerID="f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.597832 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"} err="failed to get container status \"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f\": rpc error: code = NotFound desc = could not find container \"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f\": container with ID starting with f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f not found: ID does not exist" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.597876 4795 scope.go:117] "RemoveContainer" containerID="ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.599713 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"} err="failed to get container status \"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6\": rpc error: code = NotFound desc = could not find container \"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6\": container with ID starting with ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6 not found: ID does not exist" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.619693 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.625746 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:38:24 
crc kubenswrapper[4795]: I0320 17:38:24.637820 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.638292 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-log" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638308 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-log" Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.638331 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-httpd" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638339 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-httpd" Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.638361 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638381 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.638403 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="init" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638412 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="init" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638601 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638644 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-httpd" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638671 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-log" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.639844 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.645590 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.645898 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.685088 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748007 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748273 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22mcg\" (UniqueName: \"kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748378 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22mcg\" (UniqueName: \"kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs\") pod 
\"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849683 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.851953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.852573 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") device 
mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.853165 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.858047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.859483 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.860074 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.867365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22mcg\" (UniqueName: \"kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc 
kubenswrapper[4795]: I0320 17:38:24.868486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.901142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.977928 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:38:25 crc kubenswrapper[4795]: I0320 17:38:25.262216 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" path="/var/lib/kubelet/pods/1e38733a-b81f-4fc5-9ef5-22e14c513263/volumes" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.152930 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186404 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186490 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2c2n\" (UniqueName: \"kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186569 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186757 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.196201 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n" (OuterVolumeSpecName: "kube-api-access-g2c2n") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "kube-api-access-g2c2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.196219 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts" (OuterVolumeSpecName: "scripts") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.196608 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.212940 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.215385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data" (OuterVolumeSpecName: "config-data") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.232921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288770 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288812 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288825 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2c2n\" (UniqueName: \"kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288839 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 
17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288851 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288862 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.546751 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfpzw" event={"ID":"4244f6d6-536a-4555-a05b-176d696d427d","Type":"ContainerDied","Data":"f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6"} Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.546791 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.546844 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.353286 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-85b996ff68-fdzxg"] Mar 20 17:38:27 crc kubenswrapper[4795]: E0320 17:38:27.353721 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4244f6d6-536a-4555-a05b-176d696d427d" containerName="keystone-bootstrap" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.353734 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4244f6d6-536a-4555-a05b-176d696d427d" containerName="keystone-bootstrap" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.353906 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4244f6d6-536a-4555-a05b-176d696d427d" containerName="keystone-bootstrap" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.354502 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.358128 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29kfm" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.358291 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.358461 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.360260 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.360422 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.360567 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone-scripts" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.384602 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85b996ff68-fdzxg"] Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.433994 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-credential-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-config-data\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-combined-ca-bundle\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434143 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-public-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434164 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-scripts\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434198 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flrft\" (UniqueName: \"kubernetes.io/projected/7b20a034-11f6-40ad-9447-32c49f705c07-kube-api-access-flrft\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434286 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-fernet-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-internal-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.535933 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-fernet-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-internal-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-credential-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-config-data\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537857 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-combined-ca-bundle\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-public-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-scripts\") pod 
\"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537989 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flrft\" (UniqueName: \"kubernetes.io/projected/7b20a034-11f6-40ad-9447-32c49f705c07-kube-api-access-flrft\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.541801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-internal-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.542143 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-scripts\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.542494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-public-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.543520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-fernet-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" 
Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.544542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-credential-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.545773 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-config-data\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.555648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-combined-ca-bundle\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.567345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flrft\" (UniqueName: \"kubernetes.io/projected/7b20a034-11f6-40ad-9447-32c49f705c07-kube-api-access-flrft\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.672878 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.525329 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.525394 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.560555 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.574247 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.600716 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.600900 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.778036 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.778404 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.779270 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.806140 4795 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nfr5n" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.807890 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-7flct" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.836808 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860703 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxdsg\" (UniqueName: \"kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg\") pod \"78238b29-6bdd-4f77-847e-731c6c785ed9\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860773 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts\") pod \"78238b29-6bdd-4f77-847e-731c6c785ed9\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs\") pod \"78238b29-6bdd-4f77-847e-731c6c785ed9\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff42s\" (UniqueName: \"kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s\") pod \"e83d2a1a-2b3b-409a-997a-672e322b1d8e\" (UID: \"e83d2a1a-2b3b-409a-997a-672e322b1d8e\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860967 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle\") pod \"78238b29-6bdd-4f77-847e-731c6c785ed9\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860980 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data\") pod \"78238b29-6bdd-4f77-847e-731c6c785ed9\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.868067 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs" (OuterVolumeSpecName: "logs") pod "78238b29-6bdd-4f77-847e-731c6c785ed9" (UID: "78238b29-6bdd-4f77-847e-731c6c785ed9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.908661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg" (OuterVolumeSpecName: "kube-api-access-dxdsg") pod "78238b29-6bdd-4f77-847e-731c6c785ed9" (UID: "78238b29-6bdd-4f77-847e-731c6c785ed9"). InnerVolumeSpecName "kube-api-access-dxdsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.908719 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s" (OuterVolumeSpecName: "kube-api-access-ff42s") pod "e83d2a1a-2b3b-409a-997a-672e322b1d8e" (UID: "e83d2a1a-2b3b-409a-997a-672e322b1d8e"). InnerVolumeSpecName "kube-api-access-ff42s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.908749 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.909039 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.921910 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7fb74ddb8-dbrvh" podUID="f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.922676 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78238b29-6bdd-4f77-847e-731c6c785ed9" (UID: "78238b29-6bdd-4f77-847e-731c6c785ed9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.926090 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data" (OuterVolumeSpecName: "config-data") pod "78238b29-6bdd-4f77-847e-731c6c785ed9" (UID: "78238b29-6bdd-4f77-847e-731c6c785ed9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.927905 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts" (OuterVolumeSpecName: "scripts") pod "78238b29-6bdd-4f77-847e-731c6c785ed9" (UID: "78238b29-6bdd-4f77-847e-731c6c785ed9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.962409 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config\") pod \"37537245-d57e-4087-ade6-6c028eb4d137\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.962503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-868zx\" (UniqueName: \"kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx\") pod \"37537245-d57e-4087-ade6-6c028eb4d137\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.962541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle\") pod \"37537245-d57e-4087-ade6-6c028eb4d137\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963028 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963054 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff42s\" (UniqueName: 
\"kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963070 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963082 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963094 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxdsg\" (UniqueName: \"kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963105 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.966177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx" (OuterVolumeSpecName: "kube-api-access-868zx") pod "37537245-d57e-4087-ade6-6c028eb4d137" (UID: "37537245-d57e-4087-ade6-6c028eb4d137"). InnerVolumeSpecName "kube-api-access-868zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.005492 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config" (OuterVolumeSpecName: "config") pod "37537245-d57e-4087-ade6-6c028eb4d137" (UID: "37537245-d57e-4087-ade6-6c028eb4d137"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.015803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37537245-d57e-4087-ade6-6c028eb4d137" (UID: "37537245-d57e-4087-ade6-6c028eb4d137"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.064946 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.064974 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-868zx\" (UniqueName: \"kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.064989 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.275635 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85b996ff68-fdzxg"] Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.388405 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.619908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerStarted","Data":"845acac18cf9a7e90383c869af15e1e179f275559f11db3566d8f11150b78c3d"} Mar 20 17:38:29 crc 
kubenswrapper[4795]: I0320 17:38:29.639866 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85b996ff68-fdzxg" event={"ID":"7b20a034-11f6-40ad-9447-32c49f705c07","Type":"ContainerStarted","Data":"17fb2772904e9cb1a712ecf983534dfaadc0d44555105dfe0904f9c295f5ae49"}
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.647620 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfr5n" event={"ID":"78238b29-6bdd-4f77-847e-731c6c785ed9","Type":"ContainerDied","Data":"88a328785f37d83e7d6391b28b27d2d2c6fdbb3c3985b1505819228d225ea6fa"}
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.647656 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88a328785f37d83e7d6391b28b27d2d2c6fdbb3c3985b1505819228d225ea6fa"
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.647732 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.649746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mx5b" event={"ID":"37537245-d57e-4087-ade6-6c028eb4d137","Type":"ContainerDied","Data":"c771cb8aaa4f06cd374656dadf93993f0cd60baafd23685f3873cdc0a25a81b2"}
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.649768 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c771cb8aaa4f06cd374656dadf93993f0cd60baafd23685f3873cdc0a25a81b2"
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.650597 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mx5b"
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.660754 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-7flct"
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.661191 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-7flct" event={"ID":"e83d2a1a-2b3b-409a-997a-672e322b1d8e","Type":"ContainerDied","Data":"d0709eba067851036b3ccf0f38eb78dfb3069d88982f3378db1881ed2de27d68"}
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.661243 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0709eba067851036b3ccf0f38eb78dfb3069d88982f3378db1881ed2de27d68"
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.663010 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerStarted","Data":"79719142974a75aa1ceb9ca03ec61b98a42d47f6e27982f5c5a5e0502981ad81"}
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.936743 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-b9gh7"]
Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.961239 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-b9gh7"]
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.061373 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fc784f9bb-wjct6"]
Mar 20 17:38:30 crc kubenswrapper[4795]: E0320 17:38:30.061747 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78238b29-6bdd-4f77-847e-731c6c785ed9" containerName="placement-db-sync"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.061767 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="78238b29-6bdd-4f77-847e-731c6c785ed9" containerName="placement-db-sync"
Mar 20 17:38:30 crc kubenswrapper[4795]: E0320 17:38:30.061785 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83d2a1a-2b3b-409a-997a-672e322b1d8e" containerName="oc"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.061793 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83d2a1a-2b3b-409a-997a-672e322b1d8e" containerName="oc"
Mar 20 17:38:30 crc kubenswrapper[4795]: E0320 17:38:30.061820 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37537245-d57e-4087-ade6-6c028eb4d137" containerName="neutron-db-sync"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.061829 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="37537245-d57e-4087-ade6-6c028eb4d137" containerName="neutron-db-sync"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.062091 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="37537245-d57e-4087-ade6-6c028eb4d137" containerName="neutron-db-sync"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.062111 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83d2a1a-2b3b-409a-997a-672e322b1d8e" containerName="oc"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.062130 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="78238b29-6bdd-4f77-847e-731c6c785ed9" containerName="placement-db-sync"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.083561 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.103543 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.104286 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qdq8q"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.104431 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.104561 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.108714 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.110580 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc784f9bb-wjct6"]
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.199987 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48841a5b-142c-49d0-8e87-8562f8d1f824-logs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200032 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9765z\" (UniqueName: \"kubernetes.io/projected/48841a5b-142c-49d0-8e87-8562f8d1f824-kube-api-access-9765z\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-internal-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-config-data\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200109 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-public-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-combined-ca-bundle\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200164 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-scripts\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48841a5b-142c-49d0-8e87-8562f8d1f824-logs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9765z\" (UniqueName: \"kubernetes.io/projected/48841a5b-142c-49d0-8e87-8562f8d1f824-kube-api-access-9765z\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302415 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-internal-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-config-data\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-public-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-combined-ca-bundle\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-scripts\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.308306 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48841a5b-142c-49d0-8e87-8562f8d1f824-logs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.308818 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-internal-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.309959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-public-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.311826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-scripts\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.318207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-config-data\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.318249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-combined-ca-bundle\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.349769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9765z\" (UniqueName: \"kubernetes.io/projected/48841a5b-142c-49d0-8e87-8562f8d1f824-kube-api-access-9765z\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.365745 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"]
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.367119 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.406382 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"]
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.408946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.408980 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.409051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6csjq\" (UniqueName: \"kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.409071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.409102 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.409120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.426065 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.510688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.510951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.511015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6csjq\" (UniqueName: \"kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.511035 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.511066 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.511083 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.511759 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.512049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.512463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.512582 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.512789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.533990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6csjq\" (UniqueName: \"kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.540310 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-575df674dd-5xp2t"]
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.541638 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.544899 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.545171 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.545679 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.545815 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qbkvb"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.552218 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-575df674dd-5xp2t"]
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.612636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.612709 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2x7h\" (UniqueName: \"kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.612749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.612791 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.612828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.692180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4t68k" event={"ID":"d254abd5-b344-416a-b99d-96737388795e","Type":"ContainerStarted","Data":"20458e6912d6d217fb3ae1b5fc987499c631ca087920807a0e981310469342cf"}
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.694840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerStarted","Data":"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4"}
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.695905 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.695920 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.696826 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85b996ff68-fdzxg" event={"ID":"7b20a034-11f6-40ad-9447-32c49f705c07","Type":"ContainerStarted","Data":"01fce90f9912505f0c63c2440dbfb5a9d7dbc1b947372ef3c1ec663fec74a640"}
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.696846 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-85b996ff68-fdzxg"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721091 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721161 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2x7h\" (UniqueName: \"kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721310 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.722481 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-4t68k" podStartSLOduration=2.6786024790000003 podStartE2EDuration="40.722458209s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="2026-03-20 17:37:51.741583902 +0000 UTC m=+1215.199615443" lastFinishedPulling="2026-03-20 17:38:29.785439632 +0000 UTC m=+1253.243471173" observedRunningTime="2026-03-20 17:38:30.707631183 +0000 UTC m=+1254.165662734" watchObservedRunningTime="2026-03-20 17:38:30.722458209 +0000 UTC m=+1254.180489750"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.731338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.737291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.739600 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.750369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2x7h\" (UniqueName: \"kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.756011 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-85b996ff68-fdzxg" podStartSLOduration=3.7559947129999998 podStartE2EDuration="3.755994713s" podCreationTimestamp="2026-03-20 17:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:30.745149032 +0000 UTC m=+1254.203180573" watchObservedRunningTime="2026-03-20 17:38:30.755994713 +0000 UTC m=+1254.214026254"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.839680 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.899073 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.223173 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"]
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.270665 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" path="/var/lib/kubelet/pods/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd/volumes"
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.328459 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc784f9bb-wjct6"]
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.458934 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.462115 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.670340 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-575df674dd-5xp2t"]
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.716102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc784f9bb-wjct6" event={"ID":"48841a5b-142c-49d0-8e87-8562f8d1f824","Type":"ContainerStarted","Data":"d07d6dd3cc2db4cdd1777a7fc98fdc1ca644aa4e69a46449d4895e63be3fa501"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.718939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc784f9bb-wjct6" event={"ID":"48841a5b-142c-49d0-8e87-8562f8d1f824","Type":"ContainerStarted","Data":"faeae9f21a956a7e66e45503fb180190dd7762567a0ecca500272e336de66482"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.736351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerStarted","Data":"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.809477 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.8094548580000005 podStartE2EDuration="7.809454858s" podCreationTimestamp="2026-03-20 17:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:31.771392092 +0000 UTC m=+1255.229423633" watchObservedRunningTime="2026-03-20 17:38:31.809454858 +0000 UTC m=+1255.267486399"
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.815331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rdxps" event={"ID":"706c47a0-7763-44af-9b14-0e5322a8f2f1","Type":"ContainerStarted","Data":"b272744c883e204ac3c7a8e8c3e62d9d484f7c330b2609300488f12a64494d78"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.834090 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerID="4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163" exitCode=0
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.836533 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" event={"ID":"7a77884f-5f74-473c-9875-d7afc62ab2f5","Type":"ContainerDied","Data":"4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.836573 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" event={"ID":"7a77884f-5f74-473c-9875-d7afc62ab2f5","Type":"ContainerStarted","Data":"af377878f795441c98f680067ea533216f89a8059101023b972799ab26727a8a"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.872213 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rdxps" podStartSLOduration=3.5542913 podStartE2EDuration="41.872193969s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="2026-03-20 17:37:51.741348915 +0000 UTC m=+1215.199380456" lastFinishedPulling="2026-03-20 17:38:30.059251584 +0000 UTC m=+1253.517283125" observedRunningTime="2026-03-20 17:38:31.863678031 +0000 UTC m=+1255.321709592" watchObservedRunningTime="2026-03-20 17:38:31.872193969 +0000 UTC m=+1255.330225510"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.863794 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" event={"ID":"7a77884f-5f74-473c-9875-d7afc62ab2f5","Type":"ContainerStarted","Data":"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773"}
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.864201 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.876461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc784f9bb-wjct6" event={"ID":"48841a5b-142c-49d0-8e87-8562f8d1f824","Type":"ContainerStarted","Data":"35559ff3a8ad2661db60328fffdb89879fc735d0d8b68e088b90d8af8feccac4"}
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.878515 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.878725 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.897278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerStarted","Data":"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476"}
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.897333 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerStarted","Data":"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a"}
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.897345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerStarted","Data":"6b6c595fe74467a83b78631dcaec9938772f82ba49da938ae37e739e51dd0a38"}
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.898228 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" podStartSLOduration=2.8982111120000003 podStartE2EDuration="2.898211112s" podCreationTimestamp="2026-03-20 17:38:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:32.890947064 +0000 UTC m=+1256.348978605" watchObservedRunningTime="2026-03-20 17:38:32.898211112 +0000 UTC m=+1256.356242653"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.898291 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-649db44647-mrjns"]
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.903267 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.907187 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.907266 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.945908 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-649db44647-mrjns"]
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.947296 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-fc784f9bb-wjct6" podStartSLOduration=2.947280483 podStartE2EDuration="2.947280483s" podCreationTimestamp="2026-03-20 17:38:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:32.918381175 +0000 UTC m=+1256.376412716" watchObservedRunningTime="2026-03-20 17:38:32.947280483 +0000 UTC m=+1256.405312024"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.974011 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-575df674dd-5xp2t" podStartSLOduration=2.973991533 podStartE2EDuration="2.973991533s" podCreationTimestamp="2026-03-20 17:38:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:32.961373166 +0000 UTC m=+1256.419404937" watchObservedRunningTime="2026-03-20 17:38:32.973991533 +0000 UTC m=+1256.432023074"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082672 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-public-tls-certs\") pod
\"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-config\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-internal-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-ovndb-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082921 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjkjf\" (UniqueName: \"kubernetes.io/projected/5a472785-4467-4c97-93b9-e6f6eff19126-kube-api-access-jjkjf\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082954 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-httpd-config\") pod 
\"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082985 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-combined-ca-bundle\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-internal-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-ovndb-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjkjf\" (UniqueName: \"kubernetes.io/projected/5a472785-4467-4c97-93b9-e6f6eff19126-kube-api-access-jjkjf\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-httpd-config\") pod \"neutron-649db44647-mrjns\" (UID: 
\"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-combined-ca-bundle\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-public-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-config\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.197672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-combined-ca-bundle\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.199966 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-internal-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc 
kubenswrapper[4795]: I0320 17:38:33.200377 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-ovndb-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.200701 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-config\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.201641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-public-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.205828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-httpd-config\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.208248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjkjf\" (UniqueName: \"kubernetes.io/projected/5a472785-4467-4c97-93b9-e6f6eff19126-kube-api-access-jjkjf\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.233564 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.804558 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-649db44647-mrjns"] Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.921972 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649db44647-mrjns" event={"ID":"5a472785-4467-4c97-93b9-e6f6eff19126","Type":"ContainerStarted","Data":"f17f708ce8f6f9465a4b59b4af8b0853d0f9e2263ad31d5a00163b96eb3c8597"} Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.922524 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-575df674dd-5xp2t" Mar 20 17:38:34 crc kubenswrapper[4795]: I0320 17:38:34.935945 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649db44647-mrjns" event={"ID":"5a472785-4467-4c97-93b9-e6f6eff19126","Type":"ContainerStarted","Data":"77223825f1ef15c8abf7e03b971e41ccc6a4baeed09d05bd269cf50328a9cf4c"} Mar 20 17:38:34 crc kubenswrapper[4795]: I0320 17:38:34.936313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649db44647-mrjns" event={"ID":"5a472785-4467-4c97-93b9-e6f6eff19126","Type":"ContainerStarted","Data":"4326e2335a5d3a14f50bdd5288042c5e10959366a9d9580c26859376549fdce0"} Mar 20 17:38:34 crc kubenswrapper[4795]: I0320 17:38:34.961655 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-649db44647-mrjns" podStartSLOduration=2.961636737 podStartE2EDuration="2.961636737s" podCreationTimestamp="2026-03-20 17:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:34.953013646 +0000 UTC m=+1258.411045207" watchObservedRunningTime="2026-03-20 17:38:34.961636737 +0000 UTC m=+1258.419668278" Mar 20 17:38:34 crc kubenswrapper[4795]: I0320 17:38:34.979393 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:38:34 crc kubenswrapper[4795]: I0320 17:38:34.979440 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.016175 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.026451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.948304 4795 generic.go:334] "Generic (PLEG): container finished" podID="d254abd5-b344-416a-b99d-96737388795e" containerID="20458e6912d6d217fb3ae1b5fc987499c631ca087920807a0e981310469342cf" exitCode=0 Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.948416 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4t68k" event={"ID":"d254abd5-b344-416a-b99d-96737388795e","Type":"ContainerDied","Data":"20458e6912d6d217fb3ae1b5fc987499c631ca087920807a0e981310469342cf"} Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.950028 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.950079 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-649db44647-mrjns" Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.950093 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:38:37 crc kubenswrapper[4795]: I0320 17:38:37.755586 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:38:37 crc kubenswrapper[4795]: I0320 17:38:37.759328 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:38:38 crc kubenswrapper[4795]: I0320 17:38:38.777723 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 17:38:38 crc kubenswrapper[4795]: I0320 17:38:38.903003 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7fb74ddb8-dbrvh" podUID="f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 20 17:38:38 crc kubenswrapper[4795]: I0320 17:38:38.987528 4795 generic.go:334] "Generic (PLEG): container finished" podID="706c47a0-7763-44af-9b14-0e5322a8f2f1" containerID="b272744c883e204ac3c7a8e8c3e62d9d484f7c330b2609300488f12a64494d78" exitCode=0 Mar 20 17:38:38 crc kubenswrapper[4795]: I0320 17:38:38.987618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rdxps" event={"ID":"706c47a0-7763-44af-9b14-0e5322a8f2f1","Type":"ContainerDied","Data":"b272744c883e204ac3c7a8e8c3e62d9d484f7c330b2609300488f12a64494d78"} Mar 20 17:38:40 crc kubenswrapper[4795]: I0320 17:38:40.726836 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" Mar 20 17:38:40 crc kubenswrapper[4795]: I0320 17:38:40.878345 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"] Mar 20 17:38:40 crc kubenswrapper[4795]: I0320 17:38:40.879035 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" 
podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="dnsmasq-dns" containerID="cri-o://fb25591a64e281622f2e5d8c32301267612a9141ea5e32d635d846faf3ba4c18" gracePeriod=10 Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.300463 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.300529 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.300578 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.301369 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.301431 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd" gracePeriod=600 Mar 20 17:38:41 crc kubenswrapper[4795]: 
I0320 17:38:41.808567 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4t68k" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.819282 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rdxps" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940642 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940753 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940813 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle\") pod \"d254abd5-b344-416a-b99d-96737388795e\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data\") pod \"d254abd5-b344-416a-b99d-96737388795e\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.941012 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.941056 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.941083 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4285l\" (UniqueName: \"kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.941152 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-692nl\" (UniqueName: \"kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl\") pod \"d254abd5-b344-416a-b99d-96737388795e\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " Mar 20 17:38:41 
crc kubenswrapper[4795]: I0320 17:38:41.942218 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.947010 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d254abd5-b344-416a-b99d-96737388795e" (UID: "d254abd5-b344-416a-b99d-96737388795e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.947044 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts" (OuterVolumeSpecName: "scripts") pod "706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.952093 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl" (OuterVolumeSpecName: "kube-api-access-692nl") pod "d254abd5-b344-416a-b99d-96737388795e" (UID: "d254abd5-b344-416a-b99d-96737388795e"). InnerVolumeSpecName "kube-api-access-692nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.952164 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.952412 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l" (OuterVolumeSpecName: "kube-api-access-4285l") pod "706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "kube-api-access-4285l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.968645 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d254abd5-b344-416a-b99d-96737388795e" (UID: "d254abd5-b344-416a-b99d-96737388795e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.969675 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.014971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data" (OuterVolumeSpecName: "config-data") pod "706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.015597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4t68k" event={"ID":"d254abd5-b344-416a-b99d-96737388795e","Type":"ContainerDied","Data":"eaa9eee2e882516c5d4ae5df7684d52bf42c7eec92e061674b1b8ad393538f60"} Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.015646 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaa9eee2e882516c5d4ae5df7684d52bf42c7eec92e061674b1b8ad393538f60" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.015612 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4t68k" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.017995 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd" exitCode=0 Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.018070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd"} Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.018114 4795 scope.go:117] "RemoveContainer" containerID="f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.023499 4795 generic.go:334] "Generic (PLEG): container finished" podID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerID="fb25591a64e281622f2e5d8c32301267612a9141ea5e32d635d846faf3ba4c18" exitCode=0 Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.023575 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" 
event={"ID":"f79c1ee6-f8b4-485c-ac9e-667a09868206","Type":"ContainerDied","Data":"fb25591a64e281622f2e5d8c32301267612a9141ea5e32d635d846faf3ba4c18"}
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.026439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rdxps" event={"ID":"706c47a0-7763-44af-9b14-0e5322a8f2f1","Type":"ContainerDied","Data":"d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5"}
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.026472 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5"
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.026521 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044657 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044721 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4285l\" (UniqueName: \"kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044732 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-692nl\" (UniqueName: \"kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044741 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044757 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044771 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044785 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044797 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.317177 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.451849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j89f\" (UniqueName: \"kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") "
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.452229 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") "
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.452290 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") "
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.452326 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") "
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.452394 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") "
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.452469 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") "
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.456195 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f" (OuterVolumeSpecName: "kube-api-access-9j89f") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "kube-api-access-9j89f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.506942 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.510259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config" (OuterVolumeSpecName: "config") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.512372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:42 crc kubenswrapper[4795]: E0320 17:38:42.531305 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0 podName:f79c1ee6-f8b4-485c-ac9e-667a09868206 nodeName:}" failed. No retries permitted until 2026-03-20 17:38:43.031280124 +0000 UTC m=+1266.489311665 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206") : error deleting /var/lib/kubelet/pods/f79c1ee6-f8b4-485c-ac9e-667a09868206/volume-subpaths: remove /var/lib/kubelet/pods/f79c1ee6-f8b4-485c-ac9e-667a09868206/volume-subpaths: no such file or directory
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.531558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.555666 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.555716 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j89f\" (UniqueName: \"kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.555730 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.555741 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.555750 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.048744 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1"}
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.063743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") "
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.064856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.081079 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" event={"ID":"f79c1ee6-f8b4-485c-ac9e-667a09868206","Type":"ContainerDied","Data":"f8b83339bc587b6ce7e9840f042549b74db3594a545a74800d72e3d558d164b1"}
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.081135 4795 scope.go:117] "RemoveContainer" containerID="fb25591a64e281622f2e5d8c32301267612a9141ea5e32d635d846faf3ba4c18"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.081258 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.125534 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-76977cb5bb-84w8l"]
Mar 20 17:38:43 crc kubenswrapper[4795]: E0320 17:38:43.125879 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="dnsmasq-dns"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.125891 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="dnsmasq-dns"
Mar 20 17:38:43 crc kubenswrapper[4795]: E0320 17:38:43.125905 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="init"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.125912 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="init"
Mar 20 17:38:43 crc kubenswrapper[4795]: E0320 17:38:43.125938 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1" containerName="cinder-db-sync"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.125944 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1" containerName="cinder-db-sync"
Mar 20 17:38:43 crc kubenswrapper[4795]: E0320 17:38:43.125955 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d254abd5-b344-416a-b99d-96737388795e" containerName="barbican-db-sync"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.125960 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d254abd5-b344-416a-b99d-96737388795e" containerName="barbican-db-sync"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.126129 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="dnsmasq-dns"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.126141 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d254abd5-b344-416a-b99d-96737388795e" containerName="barbican-db-sync"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.126159 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1" containerName="cinder-db-sync"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.133310 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.140196 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qhmpx"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.140519 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.141504 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.149967 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerStarted","Data":"7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220"}
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.150168 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-central-agent" containerID="cri-o://3b6b098ddb9cfeee495acedec7b7145d7d7ba5c2f18ba21ad1f2ac7b8c96c1b3" gracePeriod=30
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.150316 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.150354 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="proxy-httpd" containerID="cri-o://7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220" gracePeriod=30
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.150390 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="sg-core" containerID="cri-o://845acac18cf9a7e90383c869af15e1e179f275559f11db3566d8f11150b78c3d" gracePeriod=30
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.150420 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-notification-agent" containerID="cri-o://5d540cc1bf447dcf67570fd849a590667f72184381546b2b04f9eddfb973cf69" gracePeriod=30
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.170093 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-558cc4f6c9-d6wp7"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.171409 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.176796 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.193366 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.194362 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76977cb5bb-84w8l"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.207732 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.252039 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-558cc4f6c9-d6wp7"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.288987 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.648669325 podStartE2EDuration="53.288966437s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="2026-03-20 17:37:51.790881467 +0000 UTC m=+1215.248913008" lastFinishedPulling="2026-03-20 17:38:42.431178569 +0000 UTC m=+1265.889210120" observedRunningTime="2026-03-20 17:38:43.183670199 +0000 UTC m=+1266.641701730" watchObservedRunningTime="2026-03-20 17:38:43.288966437 +0000 UTC m=+1266.746997978"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-logs\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data-custom\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa8c15c-b759-4db8-ac4d-28648a8cfde2-logs\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data-custom\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-combined-ca-bundle\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294503 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294544 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-combined-ca-bundle\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294590 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4rfn\" (UniqueName: \"kubernetes.io/projected/faa8c15c-b759-4db8-ac4d-28648a8cfde2-kube-api-access-j4rfn\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sxmm\" (UniqueName: \"kubernetes.io/projected/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-kube-api-access-6sxmm\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.335234 4795 scope.go:117] "RemoveContainer" containerID="65673e010192e9af6a054b2e6fafb5d1f1505b377d27e64bdbfd06c2c8d1a1c2"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.340275 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406748 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-logs\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data-custom\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406873 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa8c15c-b759-4db8-ac4d-28648a8cfde2-logs\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406912 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data-custom\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406938 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-combined-ca-bundle\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406981 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.407013 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.407038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-combined-ca-bundle\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.407121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4rfn\" (UniqueName: \"kubernetes.io/projected/faa8c15c-b759-4db8-ac4d-28648a8cfde2-kube-api-access-j4rfn\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.407147 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sxmm\" (UniqueName: \"kubernetes.io/projected/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-kube-api-access-6sxmm\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.407834 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-logs\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.410431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa8c15c-b759-4db8-ac4d-28648a8cfde2-logs\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.426036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-combined-ca-bundle\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.426309 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.426618 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-combined-ca-bundle\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.435772 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dsxmf"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.437346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.439345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data-custom\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.439884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data-custom\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.443626 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4rfn\" (UniqueName: \"kubernetes.io/projected/faa8c15c-b759-4db8-ac4d-28648a8cfde2-kube-api-access-j4rfn\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.455324 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.457556 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.459159 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.460989 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sxmm\" (UniqueName: \"kubernetes.io/projected/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-kube-api-access-6sxmm\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.464989 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.468829 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.468932 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.468840 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m5c4m"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.478028 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dsxmf"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.483301 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.505394 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.515783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.515901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.515943 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516015 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4tvg\" (UniqueName: \"kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516032 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516181 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6gqg\" (UniqueName: \"kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516253 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516286 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.528548 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-558cc4f6c9-d6wp7"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.530830 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.532217 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.534506 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.554725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.569143 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dsxmf"]
Mar 20 17:38:43 crc kubenswrapper[4795]: E0320 17:38:43.569877 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-g4tvg ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" podUID="b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.584699 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.586662 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.617800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618667 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618768 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618897 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b6gqg\" (UniqueName: \"kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618917 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618961 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619153 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619213 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.620184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.620612 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.620670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92c6w\" (UniqueName: \"kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.620792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.621189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.621477 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xvp5\" (UniqueName: \"kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.621510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tvg\" (UniqueName: \"kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.621529 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.621553 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.626017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.626303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.626772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.627019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.627398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " 
pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.637783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.638389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.638697 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.639280 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.640664 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.644048 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.644218 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.649532 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4tvg\" (UniqueName: \"kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.654354 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6gqg\" (UniqueName: \"kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.726893 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727344 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92c6w\" (UniqueName: \"kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727430 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727500 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xvp5\" (UniqueName: \"kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727853 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728157 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzbj\" (UniqueName: \"kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728519 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728591 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data\") pod 
\"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.729524 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.732877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.733140 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.733672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.734068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: 
I0320 17:38:43.734620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.735128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.741896 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.753324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.756153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92c6w\" (UniqueName: \"kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.759875 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5xvp5\" (UniqueName: \"kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830338 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830452 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830483 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzbj\" (UniqueName: \"kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830594 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.831557 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.831793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.833496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.834943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.839204 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.850329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzbj\" (UniqueName: \"kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.859772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.862327 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.885776 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.941137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.010535 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.082302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76977cb5bb-84w8l"] Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.108169 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.184677 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-558cc4f6c9-d6wp7"] Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.185992 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerID="7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220" exitCode=0 Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.186024 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerID="845acac18cf9a7e90383c869af15e1e179f275559f11db3566d8f11150b78c3d" exitCode=2 Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.186044 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerID="3b6b098ddb9cfeee495acedec7b7145d7d7ba5c2f18ba21ad1f2ac7b8c96c1b3" exitCode=0 Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.186100 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerDied","Data":"7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220"} Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.186131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerDied","Data":"845acac18cf9a7e90383c869af15e1e179f275559f11db3566d8f11150b78c3d"} Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.186146 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerDied","Data":"3b6b098ddb9cfeee495acedec7b7145d7d7ba5c2f18ba21ad1f2ac7b8c96c1b3"} Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.188478 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.188953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" event={"ID":"faa8c15c-b759-4db8-ac4d-28648a8cfde2","Type":"ContainerStarted","Data":"8f3d7a20354df7d18fc487eec759baf1acbab93bbcbc536df420db590d468521"} Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.203993 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236407 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236542 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4tvg\" (UniqueName: \"kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " Mar 
20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236569 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236597 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236648 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.237492 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.238029 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config" (OuterVolumeSpecName: "config") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.238333 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.239343 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.240224 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.242586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg" (OuterVolumeSpecName: "kube-api-access-g4tvg") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "kube-api-access-g4tvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338183 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338210 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338219 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4tvg\" (UniqueName: \"kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338228 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338236 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338243 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.352889 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.360948 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"] Mar 20 17:38:44 crc 
kubenswrapper[4795]: I0320 17:38:44.514523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"] Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.600730 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.202562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerStarted","Data":"c8191878b7b3ac40636daf5acafd14c09be03b0e522e52658ab845e1056e98aa"} Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.205173 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerStarted","Data":"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"} Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.205218 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerStarted","Data":"0867819d1c6037d3f5c50bda529eb1c4bf953092deb0683fb9110c469d1d3a7a"} Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.207088 4795 generic.go:334] "Generic (PLEG): container finished" podID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerID="7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a" exitCode=0 Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.207207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" event={"ID":"da0f84b3-294d-455f-89e7-1c8f8439a837","Type":"ContainerDied","Data":"7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a"} Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.207232 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" 
event={"ID":"da0f84b3-294d-455f-89e7-1c8f8439a837","Type":"ContainerStarted","Data":"d7ccdefaaa93b0e48b444bfb331ca4591ab4806568e7a9f1ee5df6eaa4ff29c6"} Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.210111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerStarted","Data":"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805"} Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.210162 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerStarted","Data":"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16"} Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.210179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerStarted","Data":"57c443aca3511d65cb1758f7e520aeb9b66168af6d8e810d947b3d385977aa2d"} Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.210417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.210970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.214206 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.214460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-558cc4f6c9-d6wp7" event={"ID":"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea","Type":"ContainerStarted","Data":"aa03efdcdb80e6512a64887c9fc6a68c832266c81fee6bc6488136ea3d701040"} Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.283968 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6df8664bf8-htftz" podStartSLOduration=2.283946131 podStartE2EDuration="2.283946131s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:45.247631931 +0000 UTC m=+1268.705663472" watchObservedRunningTime="2026-03-20 17:38:45.283946131 +0000 UTC m=+1268.741977672" Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.297085 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" path="/var/lib/kubelet/pods/f79c1ee6-f8b4-485c-ac9e-667a09868206/volumes" Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.346539 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.420233 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dsxmf"] Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.427911 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dsxmf"] Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.238781 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerID="5d540cc1bf447dcf67570fd849a590667f72184381546b2b04f9eddfb973cf69" exitCode=0 Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 
17:38:46.238854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerDied","Data":"5d540cc1bf447dcf67570fd849a590667f72184381546b2b04f9eddfb973cf69"} Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.241602 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api-log" containerID="cri-o://10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17" gracePeriod=30 Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.241647 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api" containerID="cri-o://a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c" gracePeriod=30 Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.241578 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerStarted","Data":"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"} Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.241955 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.244895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" event={"ID":"da0f84b3-294d-455f-89e7-1c8f8439a837","Type":"ContainerStarted","Data":"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93"} Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.245418 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.268829 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.268809841 podStartE2EDuration="3.268809841s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:46.257114284 +0000 UTC m=+1269.715145825" watchObservedRunningTime="2026-03-20 17:38:46.268809841 +0000 UTC m=+1269.726841382" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.281863 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" podStartSLOduration=3.281848591 podStartE2EDuration="3.281848591s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:46.279427255 +0000 UTC m=+1269.737458796" watchObservedRunningTime="2026-03-20 17:38:46.281848591 +0000 UTC m=+1269.739880132" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.395222 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484182 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484554 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484608 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484729 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484759 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484777 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz44h\" (UniqueName: \"kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.485305 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.485669 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.492041 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h" (OuterVolumeSpecName: "kube-api-access-zz44h") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "kube-api-access-zz44h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.492160 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts" (OuterVolumeSpecName: "scripts") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.518320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.582324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587456 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587502 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587516 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz44h\" (UniqueName: \"kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587525 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587533 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587542 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.625637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data" (OuterVolumeSpecName: "config-data") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.688607 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.127637 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197232 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197300 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197414 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197469 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xzbj\" (UniqueName: \"kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197676 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.200646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs" (OuterVolumeSpecName: "logs") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.200778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.206223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.206262 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts" (OuterVolumeSpecName: "scripts") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.206696 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj" (OuterVolumeSpecName: "kube-api-access-5xzbj") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "kube-api-access-5xzbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.284932 4795 generic.go:334] "Generic (PLEG): container finished" podID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerID="a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c" exitCode=0 Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.284967 4795 generic.go:334] "Generic (PLEG): container finished" podID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerID="10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17" exitCode=143 Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.285094 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.287663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.298420 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" path="/var/lib/kubelet/pods/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d/volumes" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299536 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299609 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299630 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299640 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299648 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299658 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299666 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xzbj\" (UniqueName: \"kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerDied","Data":"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 
17:38:47.307441 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerDied","Data":"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerDied","Data":"0867819d1c6037d3f5c50bda529eb1c4bf953092deb0683fb9110c469d1d3a7a"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307465 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" event={"ID":"faa8c15c-b759-4db8-ac4d-28648a8cfde2","Type":"ContainerStarted","Data":"56604dff2798710557b3f53431abf9d5766ec2e2e221efea7bc947c8bdd3f969"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-558cc4f6c9-d6wp7" event={"ID":"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea","Type":"ContainerStarted","Data":"8942e26425357dd66aec41f104906e52c4f17443af2003e51e6111a96df8eee3"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-558cc4f6c9-d6wp7" event={"ID":"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea","Type":"ContainerStarted","Data":"ff561ffcc67744b158c99814015cb22e1d7cf4e90c3b0b0118d0d52bff65d9f0"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307474 4795 scope.go:117] "RemoveContainer" containerID="a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerDied","Data":"d2d85da431f6c738cb4aa9ee890ff4d70deedc277e0a8410951bda0e019d69a8"} Mar 20 
17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.342093 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data" (OuterVolumeSpecName: "config-data") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.369831 4795 scope.go:117] "RemoveContainer" containerID="10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.378665 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.389996 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.403727 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.409313 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-558cc4f6c9-d6wp7" podStartSLOduration=2.5117347370000003 podStartE2EDuration="4.40929592s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="2026-03-20 17:38:44.196752416 +0000 UTC m=+1267.654783957" lastFinishedPulling="2026-03-20 17:38:46.094313599 +0000 UTC m=+1269.552345140" observedRunningTime="2026-03-20 17:38:47.376105138 +0000 UTC m=+1270.834136689" watchObservedRunningTime="2026-03-20 17:38:47.40929592 +0000 UTC m=+1270.867327461" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.412766 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.413466 
4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="sg-core" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.413493 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="sg-core" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.413512 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api-log" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.413521 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api-log" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.416187 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416208 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.416233 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-notification-agent" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416241 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-notification-agent" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.416256 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="proxy-httpd" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416263 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="proxy-httpd" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.416293 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-central-agent" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416300 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-central-agent" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416593 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="sg-core" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416607 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-central-agent" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416619 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-notification-agent" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416634 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api-log" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416644 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="proxy-httpd" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416656 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.418612 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.419609 4795 scope.go:117] "RemoveContainer" containerID="a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.421069 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.420903 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c\": container with ID starting with a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c not found: ID does not exist" containerID="a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.421196 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"} err="failed to get container status \"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c\": rpc error: code = NotFound desc = could not find container \"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c\": container with ID starting with a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c not found: ID does not exist" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.421240 4795 scope.go:117] "RemoveContainer" containerID="10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.423469 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17\": container with ID starting with 10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17 not 
found: ID does not exist" containerID="10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.423503 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"} err="failed to get container status \"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17\": rpc error: code = NotFound desc = could not find container \"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17\": container with ID starting with 10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17 not found: ID does not exist" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.423540 4795 scope.go:117] "RemoveContainer" containerID="a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.423791 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.433178 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"} err="failed to get container status \"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c\": rpc error: code = NotFound desc = could not find container \"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c\": container with ID starting with a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c not found: ID does not exist" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.433225 4795 scope.go:117] "RemoveContainer" containerID="10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.438326 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"} err="failed to get container status \"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17\": rpc error: code = NotFound desc = could not find container \"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17\": container with ID starting with 10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17 not found: ID does not exist" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.438368 4795 scope.go:117] "RemoveContainer" containerID="7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.453230 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.466637 4795 scope.go:117] "RemoveContainer" containerID="845acac18cf9a7e90383c869af15e1e179f275559f11db3566d8f11150b78c3d" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504780 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504923 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrr2b\" (UniqueName: \"kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504939 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.511048 4795 scope.go:117] "RemoveContainer" containerID="5d540cc1bf447dcf67570fd849a590667f72184381546b2b04f9eddfb973cf69" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.544885 4795 scope.go:117] "RemoveContainer" containerID="3b6b098ddb9cfeee495acedec7b7145d7d7ba5c2f18ba21ad1f2ac7b8c96c1b3" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607135 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607266 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " 
pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrr2b\" (UniqueName: \"kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.609521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.609777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.614385 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.614507 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.618557 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.619175 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.631551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrr2b\" (UniqueName: \"kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.646565 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.660303 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.677195 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.678669 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.681553 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.682911 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.684353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.687224 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.711866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2g97\" (UniqueName: \"kubernetes.io/projected/0b19426b-81a4-4498-9754-948e8b7154d9-kube-api-access-s2g97\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.711915 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.711948 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b19426b-81a4-4498-9754-948e8b7154d9-logs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.711965 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b19426b-81a4-4498-9754-948e8b7154d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.712052 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.712080 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.712096 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.712116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-scripts\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.712146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-public-tls-certs\") pod \"cinder-api-0\" 
(UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.752449 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2g97\" (UniqueName: \"kubernetes.io/projected/0b19426b-81a4-4498-9754-948e8b7154d9-kube-api-access-s2g97\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814320 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b19426b-81a4-4498-9754-948e8b7154d9-logs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814394 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b19426b-81a4-4498-9754-948e8b7154d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " 
pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814522 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814543 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-scripts\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814620 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814892 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b19426b-81a4-4498-9754-948e8b7154d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.815371 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0b19426b-81a4-4498-9754-948e8b7154d9-logs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.821505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.822127 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-scripts\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.822604 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.824211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.825889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.826143 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.837334 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2g97\" (UniqueName: \"kubernetes.io/projected/0b19426b-81a4-4498-9754-948e8b7154d9-kube-api-access-s2g97\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.037822 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.205050 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:38:48 crc kubenswrapper[4795]: W0320 17:38:48.315080 4795 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb13fc8ca_8012_4cf1_bbe4_b83ff3eb3b8d.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb13fc8ca_8012_4cf1_bbe4_b83ff3eb3b8d.slice: no such file or directory Mar 20 17:38:48 crc kubenswrapper[4795]: W0320 17:38:48.318400 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d348053_c6a6_462c_9e8d_5ff55140a554.slice/crio-7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220.scope WatchSource:0}: Error finding container 7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220: Status 404 returned error can't find the container with id 7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220 Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 
17:38:48.319740 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerStarted","Data":"a117c1a1ab5b289c10e51e9701c1262a6f901fb4294e9dacf76136a47b3ab85d"} Mar 20 17:38:48 crc kubenswrapper[4795]: W0320 17:38:48.319846 4795 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod296aa0d1_17fa_44da_8868_1ebb0006c417.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod296aa0d1_17fa_44da_8868_1ebb0006c417.slice: no such file or directory Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.320938 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.324789 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerStarted","Data":"94339a3fab4acea27dea3041f96934ac4f22ad61715ed8e250425566cfb0f82c"} Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.330446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" event={"ID":"faa8c15c-b759-4db8-ac4d-28648a8cfde2","Type":"ContainerStarted","Data":"6b2a655141d0067e1003f83ced2c6fba2714974a090bed13757747c825f7e9e2"} Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.343973 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerID="8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012" exitCode=137 Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.346421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" 
event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerDied","Data":"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012"} Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.361326 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" podStartSLOduration=2.548760871 podStartE2EDuration="5.361307329s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="2026-03-20 17:38:44.107991318 +0000 UTC m=+1267.566022859" lastFinishedPulling="2026-03-20 17:38:46.920537776 +0000 UTC m=+1270.378569317" observedRunningTime="2026-03-20 17:38:48.34891343 +0000 UTC m=+1271.806944971" watchObservedRunningTime="2026-03-20 17:38:48.361307329 +0000 UTC m=+1271.819338870" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.788027 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-777644b489-7th7n" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.841586 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmwrk\" (UniqueName: \"kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk\") pod \"7a1074ea-5432-46f8-ba74-7c68912c68b6\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.841634 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts\") pod \"7a1074ea-5432-46f8-ba74-7c68912c68b6\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.841875 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key\") pod \"7a1074ea-5432-46f8-ba74-7c68912c68b6\" (UID: 
\"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.841902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs\") pod \"7a1074ea-5432-46f8-ba74-7c68912c68b6\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.841938 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data\") pod \"7a1074ea-5432-46f8-ba74-7c68912c68b6\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.842393 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs" (OuterVolumeSpecName: "logs") pod "7a1074ea-5432-46f8-ba74-7c68912c68b6" (UID: "7a1074ea-5432-46f8-ba74-7c68912c68b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.849926 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7a1074ea-5432-46f8-ba74-7c68912c68b6" (UID: "7a1074ea-5432-46f8-ba74-7c68912c68b6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.884139 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data" (OuterVolumeSpecName: "config-data") pod "7a1074ea-5432-46f8-ba74-7c68912c68b6" (UID: "7a1074ea-5432-46f8-ba74-7c68912c68b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.891722 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk" (OuterVolumeSpecName: "kube-api-access-vmwrk") pod "7a1074ea-5432-46f8-ba74-7c68912c68b6" (UID: "7a1074ea-5432-46f8-ba74-7c68912c68b6"). InnerVolumeSpecName "kube-api-access-vmwrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.932001 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts" (OuterVolumeSpecName: "scripts") pod "7a1074ea-5432-46f8-ba74-7c68912c68b6" (UID: "7a1074ea-5432-46f8-ba74-7c68912c68b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.944068 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.944096 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.944105 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.944113 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmwrk\" (UniqueName: \"kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk\") on node \"crc\" DevicePath 
\"\"" Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.944121 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.031670 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.146914 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key\") pod \"d149d116-1195-403f-9546-5b79d24e666d\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.146946 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts\") pod \"d149d116-1195-403f-9546-5b79d24e666d\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.146977 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs\") pod \"d149d116-1195-403f-9546-5b79d24e666d\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.147022 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwnmk\" (UniqueName: \"kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk\") pod \"d149d116-1195-403f-9546-5b79d24e666d\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.147204 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data\") pod \"d149d116-1195-403f-9546-5b79d24e666d\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.147386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs" (OuterVolumeSpecName: "logs") pod "d149d116-1195-403f-9546-5b79d24e666d" (UID: "d149d116-1195-403f-9546-5b79d24e666d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.147919 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.150813 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d149d116-1195-403f-9546-5b79d24e666d" (UID: "d149d116-1195-403f-9546-5b79d24e666d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.150962 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk" (OuterVolumeSpecName: "kube-api-access-pwnmk") pod "d149d116-1195-403f-9546-5b79d24e666d" (UID: "d149d116-1195-403f-9546-5b79d24e666d"). InnerVolumeSpecName "kube-api-access-pwnmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.174532 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts" (OuterVolumeSpecName: "scripts") pod "d149d116-1195-403f-9546-5b79d24e666d" (UID: "d149d116-1195-403f-9546-5b79d24e666d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.175888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data" (OuterVolumeSpecName: "config-data") pod "d149d116-1195-403f-9546-5b79d24e666d" (UID: "d149d116-1195-403f-9546-5b79d24e666d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.253859 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.254403 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.254587 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.254770 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwnmk\" (UniqueName: \"kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:49 
crc kubenswrapper[4795]: I0320 17:38:49.273792 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" path="/var/lib/kubelet/pods/296aa0d1-17fa-44da-8868-1ebb0006c417/volumes" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.274834 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" path="/var/lib/kubelet/pods/5d348053-c6a6-462c-9e8d-5ff55140a554/volumes" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.394068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerStarted","Data":"9a3186bfb24fbd40e158c8315de5c6475f512255ad1c85538a25451c575cc22c"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.396998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b19426b-81a4-4498-9754-948e8b7154d9","Type":"ContainerStarted","Data":"7ab088ee63c56a27e095e3cd75fb5a4335bff783c050d610b3df68526323e45b"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.397047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b19426b-81a4-4498-9754-948e8b7154d9","Type":"ContainerStarted","Data":"c7a852dd130a3bede4b58f63940fd0bea93400b031cae542dee56b65c34701a8"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409378 4795 generic.go:334] "Generic (PLEG): container finished" podID="d149d116-1195-403f-9546-5b79d24e666d" containerID="109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" exitCode=137 Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409751 4795 generic.go:334] "Generic (PLEG): container finished" podID="d149d116-1195-403f-9546-5b79d24e666d" containerID="71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" exitCode=137 Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409840 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerDied","Data":"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerDied","Data":"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerDied","Data":"02a106435121a29bb7e883006bb45d54dcf75dcebd8e8d213a1788cfe4f4db42"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409913 4795 scope.go:117] "RemoveContainer" containerID="109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.410093 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.422307 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerID="fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686" exitCode=137 Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.422406 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerDied","Data":"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.422451 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerDied","Data":"715dd3d63dfa37a7a756a39bf44ed5e42b3e66055f9f285906dfbd2b63c913d6"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.422497 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-777644b489-7th7n" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.424815 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.583010201 podStartE2EDuration="6.424799639s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="2026-03-20 17:38:44.368322856 +0000 UTC m=+1267.826354397" lastFinishedPulling="2026-03-20 17:38:47.210112274 +0000 UTC m=+1270.668143835" observedRunningTime="2026-03-20 17:38:49.41718589 +0000 UTC m=+1272.875217441" watchObservedRunningTime="2026-03-20 17:38:49.424799639 +0000 UTC m=+1272.882831180" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.425229 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerStarted","Data":"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.459560 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.476771 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.485819 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-777644b489-7th7n"] Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.496823 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-777644b489-7th7n"] Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.503748 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84776bb8f8-wkk7m"] Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.504137 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon-log" Mar 20 17:38:49 
crc kubenswrapper[4795]: I0320 17:38:49.504152 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon-log" Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.504166 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504172 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.504190 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon-log" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504196 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon-log" Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.504217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504223 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504378 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon-log" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504402 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504412 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504422 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon-log" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.505342 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.512254 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.512670 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84776bb8f8-wkk7m"] Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.513187 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.562601 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-combined-ca-bundle\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.562650 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-logs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.562862 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-internal-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " 
pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.562955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data-custom\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.563047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h7k9\" (UniqueName: \"kubernetes.io/projected/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-kube-api-access-2h7k9\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.563097 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-public-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.563135 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.617143 4795 scope.go:117] "RemoveContainer" containerID="71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.664764 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-internal-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.664842 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data-custom\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.664895 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h7k9\" (UniqueName: \"kubernetes.io/projected/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-kube-api-access-2h7k9\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.664932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-public-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.664962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.665003 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-combined-ca-bundle\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.665030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-logs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.665565 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-logs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.666883 4795 scope.go:117] "RemoveContainer" containerID="109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.667718 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d\": container with ID starting with 109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d not found: ID does not exist" containerID="109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.667804 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d"} err="failed to get container status 
\"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d\": rpc error: code = NotFound desc = could not find container \"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d\": container with ID starting with 109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d not found: ID does not exist" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.667838 4795 scope.go:117] "RemoveContainer" containerID="71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.668465 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449\": container with ID starting with 71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449 not found: ID does not exist" containerID="71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.668520 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449"} err="failed to get container status \"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449\": rpc error: code = NotFound desc = could not find container \"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449\": container with ID starting with 71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449 not found: ID does not exist" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.668545 4795 scope.go:117] "RemoveContainer" containerID="109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.669376 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d"} err="failed to get 
container status \"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d\": rpc error: code = NotFound desc = could not find container \"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d\": container with ID starting with 109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d not found: ID does not exist" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.669402 4795 scope.go:117] "RemoveContainer" containerID="71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.671679 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449"} err="failed to get container status \"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449\": rpc error: code = NotFound desc = could not find container \"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449\": container with ID starting with 71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449 not found: ID does not exist" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.671731 4795 scope.go:117] "RemoveContainer" containerID="8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.674673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-internal-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.674991 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-public-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: 
\"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.675296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data-custom\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.676351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-combined-ca-bundle\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.680894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.684022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h7k9\" (UniqueName: \"kubernetes.io/projected/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-kube-api-access-2h7k9\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.839513 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.886653 4795 scope.go:117] "RemoveContainer" containerID="616dc96d2d0585b233d6d56de6ca35d75cb094f9e09720808ac513c2c13b7e20" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.050104 4795 scope.go:117] "RemoveContainer" containerID="fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.256661 4795 scope.go:117] "RemoveContainer" containerID="8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012" Mar 20 17:38:50 crc kubenswrapper[4795]: E0320 17:38:50.260810 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012\": container with ID starting with 8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012 not found: ID does not exist" containerID="8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.260856 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012"} err="failed to get container status \"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012\": rpc error: code = NotFound desc = could not find container \"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012\": container with ID starting with 8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012 not found: ID does not exist" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.260878 4795 scope.go:117] "RemoveContainer" containerID="fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686" Mar 20 17:38:50 crc kubenswrapper[4795]: E0320 17:38:50.263888 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686\": container with ID starting with fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686 not found: ID does not exist" containerID="fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.263916 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686"} err="failed to get container status \"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686\": rpc error: code = NotFound desc = could not find container \"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686\": container with ID starting with fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686 not found: ID does not exist" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.491555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerStarted","Data":"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"} Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.533740 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b19426b-81a4-4498-9754-948e8b7154d9","Type":"ContainerStarted","Data":"8f9d0c257765e53f7c8c766f6912e7a2acee2cd25b9c5dbd8ab6d4c65c12b918"} Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.533806 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.564319 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.564301958 podStartE2EDuration="3.564301958s" podCreationTimestamp="2026-03-20 17:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:50.559016462 +0000 UTC m=+1274.017048003" watchObservedRunningTime="2026-03-20 17:38:50.564301958 +0000 UTC m=+1274.022333499" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.813904 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84776bb8f8-wkk7m"] Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.929463 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.092403 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.275023 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" path="/var/lib/kubelet/pods/7a1074ea-5432-46f8-ba74-7c68912c68b6/volumes" Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.275909 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d149d116-1195-403f-9546-5b79d24e666d" path="/var/lib/kubelet/pods/d149d116-1195-403f-9546-5b79d24e666d/volumes" Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.543571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84776bb8f8-wkk7m" event={"ID":"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97","Type":"ContainerStarted","Data":"d2bcafbd4b3c6341a57ef7acb3a881cc34da497929dae0864d611cae93dcdfec"} Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.543624 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84776bb8f8-wkk7m" event={"ID":"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97","Type":"ContainerStarted","Data":"1ad175aaa2232ed6e1ae62f17a7bec2babde25162a97cd846f8f93d181f436af"} Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.543641 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-84776bb8f8-wkk7m" event={"ID":"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97","Type":"ContainerStarted","Data":"4fc057768ef8270240a33fb541dacaefe17d9040c8ebf8706a7a941e117a701a"} Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.543948 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.543982 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.546456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerStarted","Data":"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"} Mar 20 17:38:52 crc kubenswrapper[4795]: I0320 17:38:52.795198 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:38:52 crc kubenswrapper[4795]: I0320 17:38:52.855496 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:38:52 crc kubenswrapper[4795]: I0320 17:38:52.861518 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84776bb8f8-wkk7m" podStartSLOduration=3.861486146 podStartE2EDuration="3.861486146s" podCreationTimestamp="2026-03-20 17:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:51.582226257 +0000 UTC m=+1275.040257798" watchObservedRunningTime="2026-03-20 17:38:52.861486146 +0000 UTC m=+1276.319517727" Mar 20 17:38:52 crc kubenswrapper[4795]: I0320 17:38:52.913392 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"] Mar 20 17:38:53 crc kubenswrapper[4795]: I0320 
17:38:53.569216 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon-log" containerID="cri-o://da11e766148fb6f38d02c50468b495d9c10ec9fe653ddad3b144b8edd961b2d3" gracePeriod=30 Mar 20 17:38:53 crc kubenswrapper[4795]: I0320 17:38:53.569281 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" containerID="cri-o://5be00c0e636ec09ccd42a36c542755b2d984e3e3c6dddd06a91f3eb8b8a7efdb" gracePeriod=30 Mar 20 17:38:53 crc kubenswrapper[4795]: I0320 17:38:53.864459 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 17:38:53 crc kubenswrapper[4795]: I0320 17:38:53.942487 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.026317 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"] Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.027316 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="dnsmasq-dns" containerID="cri-o://076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773" gracePeriod=10 Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.146510 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.514822 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.578375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.578892 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.579086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.579127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6csjq\" (UniqueName: \"kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.579155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.579218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.604860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq" (OuterVolumeSpecName: "kube-api-access-6csjq") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "kube-api-access-6csjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.605056 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerID="076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773" exitCode=0 Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.605196 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.605886 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" event={"ID":"7a77884f-5f74-473c-9875-d7afc62ab2f5","Type":"ContainerDied","Data":"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773"} Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.605913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" event={"ID":"7a77884f-5f74-473c-9875-d7afc62ab2f5","Type":"ContainerDied","Data":"af377878f795441c98f680067ea533216f89a8059101023b972799ab26727a8a"} Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.605928 4795 scope.go:117] "RemoveContainer" containerID="076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.623788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerStarted","Data":"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"} Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.624093 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.658216 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.263269974 podStartE2EDuration="7.658198741s" podCreationTimestamp="2026-03-20 17:38:47 +0000 UTC" firstStartedPulling="2026-03-20 17:38:48.379440449 +0000 UTC m=+1271.837471990" lastFinishedPulling="2026-03-20 17:38:53.774369105 +0000 UTC m=+1277.232400757" observedRunningTime="2026-03-20 17:38:54.641558539 +0000 UTC m=+1278.099590080" watchObservedRunningTime="2026-03-20 17:38:54.658198741 +0000 UTC m=+1278.116230282" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 
17:38:54.660976 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.661746 4795 scope.go:117] "RemoveContainer" containerID="4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.664708 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.680883 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config" (OuterVolumeSpecName: "config") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.683112 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.683133 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.683145 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6csjq\" (UniqueName: \"kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.683154 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.709631 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.712159 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.718862 4795 scope.go:117] "RemoveContainer" containerID="076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773" Mar 20 17:38:54 crc kubenswrapper[4795]: E0320 17:38:54.719261 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773\": container with ID starting with 076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773 not found: ID does not exist" containerID="076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.719289 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773"} err="failed to get container status \"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773\": rpc error: code = NotFound desc = could not find container \"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773\": container with ID starting with 076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773 not found: ID does not exist" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.719312 4795 scope.go:117] "RemoveContainer" containerID="4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163" Mar 20 17:38:54 crc kubenswrapper[4795]: E0320 17:38:54.719753 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163\": container with ID starting with 4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163 not found: ID does not exist" containerID="4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.719776 
4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163"} err="failed to get container status \"4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163\": rpc error: code = NotFound desc = could not find container \"4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163\": container with ID starting with 4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163 not found: ID does not exist" Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.720193 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:54.794489 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:54.794523 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:54.945786 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"] Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:54.952346 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"] Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:55.268644 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" path="/var/lib/kubelet/pods/7a77884f-5f74-473c-9875-d7afc62ab2f5/volumes" Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:55.490006 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:55.491075 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:55.630000 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="cinder-scheduler" containerID="cri-o://94339a3fab4acea27dea3041f96934ac4f22ad61715ed8e250425566cfb0f82c" gracePeriod=30 Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:55.630093 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="probe" containerID="cri-o://9a3186bfb24fbd40e158c8315de5c6475f512255ad1c85538a25451c575cc22c" gracePeriod=30 Mar 20 17:38:56 crc kubenswrapper[4795]: I0320 17:38:56.641191 4795 generic.go:334] "Generic (PLEG): container finished" podID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerID="9a3186bfb24fbd40e158c8315de5c6475f512255ad1c85538a25451c575cc22c" exitCode=0 Mar 20 17:38:56 crc kubenswrapper[4795]: I0320 17:38:56.641443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerDied","Data":"9a3186bfb24fbd40e158c8315de5c6475f512255ad1c85538a25451c575cc22c"} Mar 20 17:38:57 crc kubenswrapper[4795]: I0320 17:38:57.655773 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerID="5be00c0e636ec09ccd42a36c542755b2d984e3e3c6dddd06a91f3eb8b8a7efdb" exitCode=0 
Mar 20 17:38:57 crc kubenswrapper[4795]: I0320 17:38:57.655842 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerDied","Data":"5be00c0e636ec09ccd42a36c542755b2d984e3e3c6dddd06a91f3eb8b8a7efdb"} Mar 20 17:38:58 crc kubenswrapper[4795]: I0320 17:38:58.778187 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.351598 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.689133 4795 generic.go:334] "Generic (PLEG): container finished" podID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerID="94339a3fab4acea27dea3041f96934ac4f22ad61715ed8e250425566cfb0f82c" exitCode=0 Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.689220 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerDied","Data":"94339a3fab4acea27dea3041f96934ac4f22ad61715ed8e250425566cfb0f82c"} Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.949043 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6gqg\" (UniqueName: \"kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992719 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992829 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992906 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.996758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.005886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.010860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg" (OuterVolumeSpecName: "kube-api-access-b6gqg") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "kube-api-access-b6gqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.021296 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts" (OuterVolumeSpecName: "scripts") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.094802 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6gqg\" (UniqueName: \"kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.095083 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.095093 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.095101 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.117942 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.157180 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data" (OuterVolumeSpecName: "config-data") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.183040 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.197046 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.197074 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.698167 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerDied","Data":"c8191878b7b3ac40636daf5acafd14c09be03b0e522e52658ab845e1056e98aa"} Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.698213 4795 scope.go:117] "RemoveContainer" containerID="9a3186bfb24fbd40e158c8315de5c6475f512255ad1c85538a25451c575cc22c" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.698326 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.725356 4795 scope.go:117] "RemoveContainer" containerID="94339a3fab4acea27dea3041f96934ac4f22ad61715ed8e250425566cfb0f82c" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.736176 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.750961 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.778799 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:39:00 crc kubenswrapper[4795]: E0320 17:39:00.779208 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="probe" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779221 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="probe" Mar 20 17:39:00 crc kubenswrapper[4795]: E0320 17:39:00.779237 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="dnsmasq-dns" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779244 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="dnsmasq-dns" Mar 20 17:39:00 crc kubenswrapper[4795]: E0320 17:39:00.779271 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="cinder-scheduler" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779277 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="cinder-scheduler" Mar 20 17:39:00 crc kubenswrapper[4795]: E0320 17:39:00.779296 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="init" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779302 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="init" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779464 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="probe" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779473 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="cinder-scheduler" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779480 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="dnsmasq-dns" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.780361 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.783354 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.805742 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmd5w\" (UniqueName: \"kubernetes.io/projected/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-kube-api-access-gmd5w\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807702 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807808 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807904 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-scripts\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-scripts\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmd5w\" (UniqueName: \"kubernetes.io/projected/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-kube-api-access-gmd5w\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 
17:39:00.909738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.911009 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-575df674dd-5xp2t" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.914634 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.918467 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.919245 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.919619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-scripts\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.947531 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmd5w\" (UniqueName: \"kubernetes.io/projected/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-kube-api-access-gmd5w\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0" Mar 20 17:39:01 crc kubenswrapper[4795]: I0320 17:39:01.105198 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:39:01 crc kubenswrapper[4795]: I0320 17:39:01.269523 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" path="/var/lib/kubelet/pods/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0/volumes" Mar 20 17:39:01 crc kubenswrapper[4795]: I0320 17:39:01.644322 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:39:01 crc kubenswrapper[4795]: I0320 17:39:01.709713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620","Type":"ContainerStarted","Data":"fc2d0c41a206352f47f3afd4c5dfbcbe68ff3f14b45245d3886e49cda2910357"} Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.004975 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.179471 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.182187 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.257154 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.323285 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"] Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.323552 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6df8664bf8-htftz" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api-log" containerID="cri-o://390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16" gracePeriod=30 Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.323940 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6df8664bf8-htftz" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api" containerID="cri-o://195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805" gracePeriod=30 Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.463937 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.466243 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.471133 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.471318 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-s2zd2" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.471430 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.485807 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.653587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.654030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.654073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t6b2\" (UniqueName: \"kubernetes.io/projected/cf3f8aea-393e-418a-ad14-2848c8df93e9-kube-api-access-2t6b2\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.654132 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.725738 4795 generic.go:334] "Generic (PLEG): container finished" podID="a4150989-c1d2-4afd-b815-cda32fec2835" containerID="390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16" exitCode=143 Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.725794 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerDied","Data":"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16"} Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.728590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620","Type":"ContainerStarted","Data":"83913acd00457440fcb89cd92b47bfecfaf55dcff6920c9989fe7d6025e926e9"} Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.756217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.756279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t6b2\" (UniqueName: \"kubernetes.io/projected/cf3f8aea-393e-418a-ad14-2848c8df93e9-kube-api-access-2t6b2\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.756365 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.756404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.758049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.773418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t6b2\" (UniqueName: \"kubernetes.io/projected/cf3f8aea-393e-418a-ad14-2848c8df93e9-kube-api-access-2t6b2\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.784798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.787756 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.809951 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.248820 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-649db44647-mrjns" Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.313676 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-575df674dd-5xp2t"] Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.314212 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-575df674dd-5xp2t" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-api" containerID="cri-o://d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a" gracePeriod=30 Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.314350 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-575df674dd-5xp2t" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-httpd" containerID="cri-o://41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476" gracePeriod=30 Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.353368 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.737962 4795 generic.go:334] "Generic (PLEG): container finished" podID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerID="41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476" exitCode=0 Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.738043 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" 
event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerDied","Data":"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476"} Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.739948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620","Type":"ContainerStarted","Data":"b696d40be59c077e289a843dab61408c60fd455055ae2c79762077e7b82d0ff7"} Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.741495 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"cf3f8aea-393e-418a-ad14-2848c8df93e9","Type":"ContainerStarted","Data":"16a9c60148a3b16e42c123935815cea94f189aeb0c36189715acf27ac5f72666"} Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.762360 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.762343726 podStartE2EDuration="3.762343726s" podCreationTimestamp="2026-03-20 17:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:03.760048864 +0000 UTC m=+1287.218080405" watchObservedRunningTime="2026-03-20 17:39:03.762343726 +0000 UTC m=+1287.220375267" Mar 20 17:39:05 crc kubenswrapper[4795]: I0320 17:39:05.797485 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6df8664bf8-htftz" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:34676->10.217.0.166:9311: read: connection reset by peer" Mar 20 17:39:05 crc kubenswrapper[4795]: I0320 17:39:05.799223 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6df8664bf8-htftz" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api" probeResult="failure" output="Get 
\"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:34674->10.217.0.166:9311: read: connection reset by peer" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.105830 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.225794 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.349742 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data\") pod \"a4150989-c1d2-4afd-b815-cda32fec2835\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.349788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom\") pod \"a4150989-c1d2-4afd-b815-cda32fec2835\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.349819 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs\") pod \"a4150989-c1d2-4afd-b815-cda32fec2835\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.349856 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle\") pod \"a4150989-c1d2-4afd-b815-cda32fec2835\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.349921 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5xvp5\" (UniqueName: \"kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5\") pod \"a4150989-c1d2-4afd-b815-cda32fec2835\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.351983 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs" (OuterVolumeSpecName: "logs") pod "a4150989-c1d2-4afd-b815-cda32fec2835" (UID: "a4150989-c1d2-4afd-b815-cda32fec2835"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.358743 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4150989-c1d2-4afd-b815-cda32fec2835" (UID: "a4150989-c1d2-4afd-b815-cda32fec2835"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.379704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5" (OuterVolumeSpecName: "kube-api-access-5xvp5") pod "a4150989-c1d2-4afd-b815-cda32fec2835" (UID: "a4150989-c1d2-4afd-b815-cda32fec2835"). InnerVolumeSpecName "kube-api-access-5xvp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.388882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4150989-c1d2-4afd-b815-cda32fec2835" (UID: "a4150989-c1d2-4afd-b815-cda32fec2835"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.410784 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data" (OuterVolumeSpecName: "config-data") pod "a4150989-c1d2-4afd-b815-cda32fec2835" (UID: "a4150989-c1d2-4afd-b815-cda32fec2835"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.455877 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.456007 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.456035 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.456046 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.456055 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xvp5\" (UniqueName: \"kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.776099 4795 generic.go:334] "Generic (PLEG): container finished" podID="a4150989-c1d2-4afd-b815-cda32fec2835" 
containerID="195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805" exitCode=0 Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.776154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerDied","Data":"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805"} Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.776178 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerDied","Data":"57c443aca3511d65cb1758f7e520aeb9b66168af6d8e810d947b3d385977aa2d"} Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.776207 4795 scope.go:117] "RemoveContainer" containerID="195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.776320 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.811028 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"] Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.816262 4795 scope.go:117] "RemoveContainer" containerID="390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.818051 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"] Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.842900 4795 scope.go:117] "RemoveContainer" containerID="195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805" Mar 20 17:39:06 crc kubenswrapper[4795]: E0320 17:39:06.843413 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805\": container with ID starting with 195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805 not found: ID does not exist" containerID="195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.843459 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805"} err="failed to get container status \"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805\": rpc error: code = NotFound desc = could not find container \"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805\": container with ID starting with 195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805 not found: ID does not exist" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.843488 4795 scope.go:117] "RemoveContainer" containerID="390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16" Mar 
20 17:39:06 crc kubenswrapper[4795]: E0320 17:39:06.843838 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16\": container with ID starting with 390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16 not found: ID does not exist" containerID="390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.843878 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16"} err="failed to get container status \"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16\": rpc error: code = NotFound desc = could not find container \"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16\": container with ID starting with 390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16 not found: ID does not exist" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.977694 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6697f55ff5-fj55x"] Mar 20 17:39:06 crc kubenswrapper[4795]: E0320 17:39:06.978017 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api-log" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.978033 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api-log" Mar 20 17:39:06 crc kubenswrapper[4795]: E0320 17:39:06.978055 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.978062 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" 
containerName="barbican-api" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.978229 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api-log" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.978249 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.983152 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.987665 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.999430 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.999613 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.006607 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6697f55ff5-fj55x"] Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167567 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-public-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167658 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-run-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" 
(UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167717 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-config-data\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167740 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxnw\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-kube-api-access-ndxnw\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167761 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-log-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167822 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-etc-swift\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167839 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-combined-ca-bundle\") pod 
\"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-internal-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.262254 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" path="/var/lib/kubelet/pods/a4150989-c1d2-4afd-b815-cda32fec2835/volumes" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269471 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-config-data\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxnw\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-kube-api-access-ndxnw\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-log-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 
17:39:07.269588 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-etc-swift\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-combined-ca-bundle\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-internal-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-public-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269727 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-run-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.270070 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-log-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.270986 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-run-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.276604 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-config-data\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.288648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-etc-swift\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.289080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-combined-ca-bundle\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.290192 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-internal-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.294373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxnw\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-kube-api-access-ndxnw\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.294677 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-public-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.305095 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.619401 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.620193 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-central-agent" containerID="cri-o://bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" gracePeriod=30 Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.620282 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="sg-core" containerID="cri-o://b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" gracePeriod=30 Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.620305 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-notification-agent" containerID="cri-o://e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" gracePeriod=30 Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.620337 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="proxy-httpd" containerID="cri-o://807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" gracePeriod=30 Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.634785 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.169:3000/\": EOF" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.796010 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" exitCode=2 Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.796156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerDied","Data":"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"} Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.855761 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6697f55ff5-fj55x"] Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.444761 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611311 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vrr2b\" (UniqueName: \"kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611490 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611557 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.612323 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.612584 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.615342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b" (OuterVolumeSpecName: "kube-api-access-vrr2b") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "kube-api-access-vrr2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.617810 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts" (OuterVolumeSpecName: "scripts") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.645050 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.689256 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.710129 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data" (OuterVolumeSpecName: "config-data") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713206 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713859 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713881 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713893 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrr2b\" (UniqueName: \"kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b\") on node \"crc\" DevicePath \"\"" 
Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713922 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713931 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713939 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.779051 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810022 4795 generic.go:334] "Generic (PLEG): container finished" podID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" exitCode=0 Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810054 4795 generic.go:334] "Generic (PLEG): container finished" podID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" exitCode=0 Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810066 4795 generic.go:334] "Generic (PLEG): container finished" podID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" exitCode=0 Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 
17:39:08.810097 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerDied","Data":"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerDied","Data":"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerDied","Data":"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerDied","Data":"a117c1a1ab5b289c10e51e9701c1262a6f901fb4294e9dacf76136a47b3ab85d"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810251 4795 scope.go:117] "RemoveContainer" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.812557 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6697f55ff5-fj55x" event={"ID":"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6","Type":"ContainerStarted","Data":"cecdd00c873c6d628dd822fd5569140a0d1f2830367ac85f8e835e186166f659"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.812585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6697f55ff5-fj55x" 
event={"ID":"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6","Type":"ContainerStarted","Data":"c8a0719b6dee041d8bdac2720159d6c04c39cca376780f0e5f7462217dfff21b"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.812595 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6697f55ff5-fj55x" event={"ID":"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6","Type":"ContainerStarted","Data":"1839cb46b9a07d6e5f4fb99ff3cc20b86bec8297ab56d804ac648323238e522a"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.812722 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.847569 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6697f55ff5-fj55x" podStartSLOduration=2.847549993 podStartE2EDuration="2.847549993s" podCreationTimestamp="2026-03-20 17:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:08.83056333 +0000 UTC m=+1292.288594871" watchObservedRunningTime="2026-03-20 17:39:08.847549993 +0000 UTC m=+1292.305581534" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.847734 4795 scope.go:117] "RemoveContainer" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.872663 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.885802 4795 scope.go:117] "RemoveContainer" containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.887433 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902128 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.902586 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-central-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902602 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-central-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.902616 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="proxy-httpd" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902622 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="proxy-httpd" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.902668 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-notification-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902676 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-notification-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.902699 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="sg-core" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902705 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="sg-core" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902868 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="proxy-httpd" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902889 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="sg-core" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902903 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-central-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902914 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-notification-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.904594 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.909395 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.909719 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.913565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.965798 4795 scope.go:117] "RemoveContainer" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.985264 4795 scope.go:117] "RemoveContainer" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.985777 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": container with ID starting with 807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6 not found: ID does not exist" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" Mar 20 17:39:08 crc 
kubenswrapper[4795]: I0320 17:39:08.985826 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"} err="failed to get container status \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": rpc error: code = NotFound desc = could not find container \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": container with ID starting with 807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6 not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.985875 4795 scope.go:117] "RemoveContainer" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.986351 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": container with ID starting with b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83 not found: ID does not exist" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.986392 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"} err="failed to get container status \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": rpc error: code = NotFound desc = could not find container \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": container with ID starting with b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83 not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.986419 4795 scope.go:117] "RemoveContainer" containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" Mar 20 
17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.986736 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": container with ID starting with e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f not found: ID does not exist" containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.986779 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"} err="failed to get container status \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": rpc error: code = NotFound desc = could not find container \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": container with ID starting with e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.986795 4795 scope.go:117] "RemoveContainer" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.987178 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": container with ID starting with bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa not found: ID does not exist" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.987204 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"} err="failed to get container status 
\"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": rpc error: code = NotFound desc = could not find container \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": container with ID starting with bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.987220 4795 scope.go:117] "RemoveContainer" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.987439 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"} err="failed to get container status \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": rpc error: code = NotFound desc = could not find container \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": container with ID starting with 807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6 not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.987458 4795 scope.go:117] "RemoveContainer" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.998638 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"} err="failed to get container status \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": rpc error: code = NotFound desc = could not find container \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": container with ID starting with b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83 not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.998672 4795 scope.go:117] "RemoveContainer" 
containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999113 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"} err="failed to get container status \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": rpc error: code = NotFound desc = could not find container \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": container with ID starting with e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999144 4795 scope.go:117] "RemoveContainer" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999515 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"} err="failed to get container status \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": rpc error: code = NotFound desc = could not find container \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": container with ID starting with bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999538 4795 scope.go:117] "RemoveContainer" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999767 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"} err="failed to get container status \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": rpc error: code = NotFound desc = could 
not find container \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": container with ID starting with 807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6 not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999793 4795 scope.go:117] "RemoveContainer" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.000048 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"} err="failed to get container status \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": rpc error: code = NotFound desc = could not find container \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": container with ID starting with b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83 not found: ID does not exist" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.000069 4795 scope.go:117] "RemoveContainer" containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.000343 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"} err="failed to get container status \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": rpc error: code = NotFound desc = could not find container \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": container with ID starting with e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f not found: ID does not exist" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.000447 4795 scope.go:117] "RemoveContainer" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 
17:39:09.000658 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"} err="failed to get container status \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": rpc error: code = NotFound desc = could not find container \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": container with ID starting with bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa not found: ID does not exist" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018078 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk8fw\" (UniqueName: \"kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018179 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018199 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd\") pod \"ceilometer-0\" (UID: 
\"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018242 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018258 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018301 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119309 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119451 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk8fw\" (UniqueName: \"kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119527 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.120028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 
crc kubenswrapper[4795]: I0320 17:39:09.122210 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.135204 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.136476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.142197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.142246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk8fw\" (UniqueName: \"kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.147842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.257230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.263333 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" path="/var/lib/kubelet/pods/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c/volumes" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.521908 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-575df674dd-5xp2t" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.631907 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config\") pod \"8cce4da2-83af-4f8a-9923-d618bd8a9225\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.631966 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle\") pod \"8cce4da2-83af-4f8a-9923-d618bd8a9225\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.632031 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2x7h\" (UniqueName: \"kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h\") pod \"8cce4da2-83af-4f8a-9923-d618bd8a9225\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.632063 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config\") pod 
\"8cce4da2-83af-4f8a-9923-d618bd8a9225\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.632109 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs\") pod \"8cce4da2-83af-4f8a-9923-d618bd8a9225\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.642913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h" (OuterVolumeSpecName: "kube-api-access-h2x7h") pod "8cce4da2-83af-4f8a-9923-d618bd8a9225" (UID: "8cce4da2-83af-4f8a-9923-d618bd8a9225"). InnerVolumeSpecName "kube-api-access-h2x7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.645966 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8cce4da2-83af-4f8a-9923-d618bd8a9225" (UID: "8cce4da2-83af-4f8a-9923-d618bd8a9225"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.699188 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config" (OuterVolumeSpecName: "config") pod "8cce4da2-83af-4f8a-9923-d618bd8a9225" (UID: "8cce4da2-83af-4f8a-9923-d618bd8a9225"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.717888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cce4da2-83af-4f8a-9923-d618bd8a9225" (UID: "8cce4da2-83af-4f8a-9923-d618bd8a9225"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.733969 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2x7h\" (UniqueName: \"kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.734000 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.734010 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.734018 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.735389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8cce4da2-83af-4f8a-9923-d618bd8a9225" (UID: "8cce4da2-83af-4f8a-9923-d618bd8a9225"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.839427 4795 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.841041 4795 generic.go:334] "Generic (PLEG): container finished" podID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerID="d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a" exitCode=0 Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.842207 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-575df674dd-5xp2t" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.842755 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerDied","Data":"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a"} Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.842792 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerDied","Data":"6b6c595fe74467a83b78631dcaec9938772f82ba49da938ae37e739e51dd0a38"} Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.842808 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.842831 4795 scope.go:117] "RemoveContainer" containerID="41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476" Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.844139 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.889757 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-575df674dd-5xp2t"] Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.899657 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-575df674dd-5xp2t"] Mar 20 17:39:11 crc kubenswrapper[4795]: I0320 17:39:11.271285 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" path="/var/lib/kubelet/pods/8cce4da2-83af-4f8a-9923-d618bd8a9225/volumes" Mar 20 17:39:11 crc kubenswrapper[4795]: I0320 17:39:11.337460 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.785662 4795 scope.go:117] "RemoveContainer" containerID="d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a" Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.845447 4795 scope.go:117] "RemoveContainer" containerID="41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476" Mar 20 17:39:14 crc kubenswrapper[4795]: E0320 17:39:14.845981 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476\": container with ID starting with 41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476 not found: ID does not exist" containerID="41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476" Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.846031 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476"} err="failed to get container status \"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476\": rpc error: code = NotFound desc = could not find container \"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476\": container with ID starting with 
41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476 not found: ID does not exist" Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.846064 4795 scope.go:117] "RemoveContainer" containerID="d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a" Mar 20 17:39:14 crc kubenswrapper[4795]: E0320 17:39:14.846391 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a\": container with ID starting with d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a not found: ID does not exist" containerID="d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a" Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.846433 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a"} err="failed to get container status \"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a\": rpc error: code = NotFound desc = could not find container \"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a\": container with ID starting with d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a not found: ID does not exist" Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.909039 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerStarted","Data":"7d09de57ae215285b7a1c023830c93b72b04b973baabd25b25f676f4a51305aa"} Mar 20 17:39:15 crc kubenswrapper[4795]: I0320 17:39:15.920930 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"cf3f8aea-393e-418a-ad14-2848c8df93e9","Type":"ContainerStarted","Data":"b7ca66e3e1d5493aad87395606a801a678c10a32acb9500132879e8bdf155903"} Mar 20 17:39:15 crc kubenswrapper[4795]: I0320 17:39:15.924544 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerStarted","Data":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} Mar 20 17:39:15 crc kubenswrapper[4795]: I0320 17:39:15.944755 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.470058321 podStartE2EDuration="13.944734328s" podCreationTimestamp="2026-03-20 17:39:02 +0000 UTC" firstStartedPulling="2026-03-20 17:39:03.375664348 +0000 UTC m=+1286.833695889" lastFinishedPulling="2026-03-20 17:39:14.850340355 +0000 UTC m=+1298.308371896" observedRunningTime="2026-03-20 17:39:15.936826108 +0000 UTC m=+1299.394857659" watchObservedRunningTime="2026-03-20 17:39:15.944734328 +0000 UTC m=+1299.402765869" Mar 20 17:39:16 crc kubenswrapper[4795]: I0320 17:39:16.933746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerStarted","Data":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} Mar 20 17:39:16 crc kubenswrapper[4795]: I0320 17:39:16.934798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerStarted","Data":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.313615 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4l89c"] Mar 20 17:39:17 crc kubenswrapper[4795]: E0320 17:39:17.314004 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-httpd" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.314021 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-httpd" 
Mar 20 17:39:17 crc kubenswrapper[4795]: E0320 17:39:17.314043 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-api" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.314051 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-api" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.314216 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-httpd" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.314240 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-api" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.314853 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.317123 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.326572 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4l89c"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.347389 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.387325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkgh9\" (UniqueName: \"kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.388022 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.420524 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-fqwkd"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.421597 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.431892 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7051-account-create-update-d2d7p"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.433028 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.434818 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.449835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7051-account-create-update-d2d7p"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.461732 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fqwkd"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.489621 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5thvz\" (UniqueName: \"kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p" 
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.489856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5zc8\" (UniqueName: \"kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.489941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.490125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgh9\" (UniqueName: \"kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.490209 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.490276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") 
" pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.490999 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.507429 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkgh9\" (UniqueName: \"kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.591640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.591701 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.591747 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5thvz\" (UniqueName: \"kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " 
pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.591771 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zc8\" (UniqueName: \"kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.592368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.592699 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.607031 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wd5n7"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.614486 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.616228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5zc8\" (UniqueName: \"kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.649936 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5thvz\" (UniqueName: \"kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.655760 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wd5n7"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.661251 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.697032 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.697145 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kntrg\" (UniqueName: \"kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.720011 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-100d-account-create-update-7l925"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.722057 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.731087 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.731296 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-100d-account-create-update-7l925"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.744565 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.753411 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.800636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.800940 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.800969 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kntrg\" (UniqueName: \"kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.801022 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z26gr\" (UniqueName: \"kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.801455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.838673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kntrg\" (UniqueName: \"kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.851730 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3748-account-create-update-j2khv"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.852827 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.858877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.869575 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3748-account-create-update-j2khv"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.902342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.902538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.902642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z26gr\" (UniqueName: \"kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.902729 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfrfg\" (UniqueName: \"kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.903328 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.942447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z26gr\" (UniqueName: \"kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:18 crc 
kubenswrapper[4795]: I0320 17:39:18.006386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfrfg\" (UniqueName: \"kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.007365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.037246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.043258 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.049167 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.072135 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfrfg\" (UniqueName: \"kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.210311 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4l89c"] Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.293341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.357262 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7051-account-create-update-d2d7p"] Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.374995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fqwkd"] Mar 20 17:39:18 crc kubenswrapper[4795]: W0320 17:39:18.385436 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d1dfe60_98b0_4644_b063_831293f9bd5c.slice/crio-23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e WatchSource:0}: Error finding container 23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e: Status 404 returned error can't find the container with id 23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.595864 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3748-account-create-update-j2khv"] Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.612706 4795 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wd5n7"] Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.689157 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-100d-account-create-update-7l925"] Mar 20 17:39:18 crc kubenswrapper[4795]: W0320 17:39:18.707613 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a51797_b6d0_4b5b_9927_54d4b965469e.slice/crio-4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248 WatchSource:0}: Error finding container 4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248: Status 404 returned error can't find the container with id 4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248 Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.778494 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.778720 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.954672 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d1dfe60-98b0-4644-b063-831293f9bd5c" containerID="f4b5007e4a1309d08572b5c31e2719d5a9d1e8abc1f29797304920c21729de14" exitCode=0 Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.954727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fqwkd" event={"ID":"7d1dfe60-98b0-4644-b063-831293f9bd5c","Type":"ContainerDied","Data":"f4b5007e4a1309d08572b5c31e2719d5a9d1e8abc1f29797304920c21729de14"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 
17:39:18.955104 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fqwkd" event={"ID":"7d1dfe60-98b0-4644-b063-831293f9bd5c","Type":"ContainerStarted","Data":"23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.956064 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wd5n7" event={"ID":"7bd6f80f-7908-42b5-b32a-63d585bd9194","Type":"ContainerStarted","Data":"970949fc704fe767a632d55a42c62ddb8f7a120a1f8f4ea713ca42632765cd3e"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.958553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerStarted","Data":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.958756 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.964864 4795 generic.go:334] "Generic (PLEG): container finished" podID="efc90399-0b15-4fc6-b441-d7df6925c8aa" containerID="b63d10a82b890eac4b6bd4726e08b48ae844a0390be4307858d97c75a41d914f" exitCode=0 Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.965065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4l89c" event={"ID":"efc90399-0b15-4fc6-b441-d7df6925c8aa","Type":"ContainerDied","Data":"b63d10a82b890eac4b6bd4726e08b48ae844a0390be4307858d97c75a41d914f"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.965108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4l89c" event={"ID":"efc90399-0b15-4fc6-b441-d7df6925c8aa","Type":"ContainerStarted","Data":"700d362b8c96896ed216b0973f95f5cdb189f63903ab165abd13610fbb69975d"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.970098 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3748-account-create-update-j2khv" event={"ID":"9f5daae9-920d-496a-ad6a-c016cfb82250","Type":"ContainerStarted","Data":"bf45615034cdb988e7a4ea9c726e6ec289a16729b8bfc6970ea397d199a8e5a3"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.974610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-100d-account-create-update-7l925" event={"ID":"65a51797-b6d0-4b5b-9927-54d4b965469e","Type":"ContainerStarted","Data":"4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.982662 4795 generic.go:334] "Generic (PLEG): container finished" podID="e42b654e-f003-45dd-a7c4-07655514643e" containerID="d5e2a993de0ac2a73d513cdc5305eaa4c6be7243356c29c7534462e994b17675" exitCode=0 Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.982715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7051-account-create-update-d2d7p" event={"ID":"e42b654e-f003-45dd-a7c4-07655514643e","Type":"ContainerDied","Data":"d5e2a993de0ac2a73d513cdc5305eaa4c6be7243356c29c7534462e994b17675"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.982737 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7051-account-create-update-d2d7p" event={"ID":"e42b654e-f003-45dd-a7c4-07655514643e","Type":"ContainerStarted","Data":"776a14a7a36db1bed300198a0a46287925190b675f42aee4a98e85192fcad1df"} Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.029527 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.246106239 podStartE2EDuration="11.029510868s" podCreationTimestamp="2026-03-20 17:39:08 +0000 UTC" firstStartedPulling="2026-03-20 17:39:14.786274253 +0000 UTC m=+1298.244305834" lastFinishedPulling="2026-03-20 17:39:18.569678932 +0000 UTC m=+1302.027710463" observedRunningTime="2026-03-20 17:39:19.013490794 +0000 UTC 
m=+1302.471522335" watchObservedRunningTime="2026-03-20 17:39:19.029510868 +0000 UTC m=+1302.487542409" Mar 20 17:39:19 crc kubenswrapper[4795]: E0320 17:39:19.602383 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a51797_b6d0_4b5b_9927_54d4b965469e.slice/crio-conmon-43233e70951461edfae55a0a1e96e29418696077a56a7e8b60e307ca9af5a951.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.992539 4795 generic.go:334] "Generic (PLEG): container finished" podID="65a51797-b6d0-4b5b-9927-54d4b965469e" containerID="43233e70951461edfae55a0a1e96e29418696077a56a7e8b60e307ca9af5a951" exitCode=0 Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.992607 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-100d-account-create-update-7l925" event={"ID":"65a51797-b6d0-4b5b-9927-54d4b965469e","Type":"ContainerDied","Data":"43233e70951461edfae55a0a1e96e29418696077a56a7e8b60e307ca9af5a951"} Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.994427 4795 generic.go:334] "Generic (PLEG): container finished" podID="7bd6f80f-7908-42b5-b32a-63d585bd9194" containerID="21aac3ceb6dfd938908085675b810b1f95e9aaa0d7afc715430454788951ca0a" exitCode=0 Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.994511 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wd5n7" event={"ID":"7bd6f80f-7908-42b5-b32a-63d585bd9194","Type":"ContainerDied","Data":"21aac3ceb6dfd938908085675b810b1f95e9aaa0d7afc715430454788951ca0a"} Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.996635 4795 generic.go:334] "Generic (PLEG): container finished" podID="9f5daae9-920d-496a-ad6a-c016cfb82250" containerID="1a5ed57a211fe9b0c1882f91516bfc8da29711316e391c1c87ed18df2cb6cc36" exitCode=0 Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.996760 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3748-account-create-update-j2khv" event={"ID":"9f5daae9-920d-496a-ad6a-c016cfb82250","Type":"ContainerDied","Data":"1a5ed57a211fe9b0c1882f91516bfc8da29711316e391c1c87ed18df2cb6cc36"} Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.415185 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.461859 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.462155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts\") pod \"efc90399-0b15-4fc6-b441-d7df6925c8aa\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.462199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkgh9\" (UniqueName: \"kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9\") pod \"efc90399-0b15-4fc6-b441-d7df6925c8aa\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.465166 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efc90399-0b15-4fc6-b441-d7df6925c8aa" (UID: "efc90399-0b15-4fc6-b441-d7df6925c8aa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.480358 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9" (OuterVolumeSpecName: "kube-api-access-xkgh9") pod "efc90399-0b15-4fc6-b441-d7df6925c8aa" (UID: "efc90399-0b15-4fc6-b441-d7df6925c8aa"). InnerVolumeSpecName "kube-api-access-xkgh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.523279 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.527506 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.564903 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.564944 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkgh9\" (UniqueName: \"kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.666205 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5thvz\" (UniqueName: \"kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz\") pod \"e42b654e-f003-45dd-a7c4-07655514643e\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.666371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts\") pod \"e42b654e-f003-45dd-a7c4-07655514643e\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.666413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts\") pod \"7d1dfe60-98b0-4644-b063-831293f9bd5c\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.666427 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5zc8\" (UniqueName: \"kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8\") pod \"7d1dfe60-98b0-4644-b063-831293f9bd5c\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.667340 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d1dfe60-98b0-4644-b063-831293f9bd5c" (UID: "7d1dfe60-98b0-4644-b063-831293f9bd5c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.667396 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e42b654e-f003-45dd-a7c4-07655514643e" (UID: "e42b654e-f003-45dd-a7c4-07655514643e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.676231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8" (OuterVolumeSpecName: "kube-api-access-s5zc8") pod "7d1dfe60-98b0-4644-b063-831293f9bd5c" (UID: "7d1dfe60-98b0-4644-b063-831293f9bd5c"). InnerVolumeSpecName "kube-api-access-s5zc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.676446 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz" (OuterVolumeSpecName: "kube-api-access-5thvz") pod "e42b654e-f003-45dd-a7c4-07655514643e" (UID: "e42b654e-f003-45dd-a7c4-07655514643e"). InnerVolumeSpecName "kube-api-access-5thvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.768816 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5thvz\" (UniqueName: \"kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.768850 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.768861 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5zc8\" (UniqueName: \"kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.768869 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.006974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4l89c" event={"ID":"efc90399-0b15-4fc6-b441-d7df6925c8aa","Type":"ContainerDied","Data":"700d362b8c96896ed216b0973f95f5cdb189f63903ab165abd13610fbb69975d"} Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.007011 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="700d362b8c96896ed216b0973f95f5cdb189f63903ab165abd13610fbb69975d" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.007054 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.009717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7051-account-create-update-d2d7p" event={"ID":"e42b654e-f003-45dd-a7c4-07655514643e","Type":"ContainerDied","Data":"776a14a7a36db1bed300198a0a46287925190b675f42aee4a98e85192fcad1df"} Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.009764 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="776a14a7a36db1bed300198a0a46287925190b675f42aee4a98e85192fcad1df" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.009828 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020094 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fqwkd" event={"ID":"7d1dfe60-98b0-4644-b063-831293f9bd5c","Type":"ContainerDied","Data":"23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e"} Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020194 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020431 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020632 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-central-agent" containerID="cri-o://47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" gracePeriod=30 Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020788 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="proxy-httpd" containerID="cri-o://b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" gracePeriod=30 Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020853 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="sg-core" containerID="cri-o://6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" gracePeriod=30 Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020914 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-notification-agent" containerID="cri-o://71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" gracePeriod=30 Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.336920 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.385378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kntrg\" (UniqueName: \"kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg\") pod \"7bd6f80f-7908-42b5-b32a-63d585bd9194\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.385506 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts\") pod \"7bd6f80f-7908-42b5-b32a-63d585bd9194\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.387427 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bd6f80f-7908-42b5-b32a-63d585bd9194" (UID: "7bd6f80f-7908-42b5-b32a-63d585bd9194"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.398842 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg" (OuterVolumeSpecName: "kube-api-access-kntrg") pod "7bd6f80f-7908-42b5-b32a-63d585bd9194" (UID: "7bd6f80f-7908-42b5-b32a-63d585bd9194"). InnerVolumeSpecName "kube-api-access-kntrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.487272 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kntrg\" (UniqueName: \"kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.487517 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.646706 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.697460 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfrfg\" (UniqueName: \"kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg\") pod \"9f5daae9-920d-496a-ad6a-c016cfb82250\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.697536 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts\") pod \"9f5daae9-920d-496a-ad6a-c016cfb82250\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.698415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f5daae9-920d-496a-ad6a-c016cfb82250" (UID: "9f5daae9-920d-496a-ad6a-c016cfb82250"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.706858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg" (OuterVolumeSpecName: "kube-api-access-kfrfg") pod "9f5daae9-920d-496a-ad6a-c016cfb82250" (UID: "9f5daae9-920d-496a-ad6a-c016cfb82250"). InnerVolumeSpecName "kube-api-access-kfrfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.750021 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.799751 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts\") pod \"65a51797-b6d0-4b5b-9927-54d4b965469e\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.799810 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z26gr\" (UniqueName: \"kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr\") pod \"65a51797-b6d0-4b5b-9927-54d4b965469e\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.800436 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfrfg\" (UniqueName: \"kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.800456 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts\") on node \"crc\" DevicePath 
\"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.801072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65a51797-b6d0-4b5b-9927-54d4b965469e" (UID: "65a51797-b6d0-4b5b-9927-54d4b965469e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.808201 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr" (OuterVolumeSpecName: "kube-api-access-z26gr") pod "65a51797-b6d0-4b5b-9927-54d4b965469e" (UID: "65a51797-b6d0-4b5b-9927-54d4b965469e"). InnerVolumeSpecName "kube-api-access-z26gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.901693 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.901719 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z26gr\" (UniqueName: \"kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.923324 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003163 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003444 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003480 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk8fw\" (UniqueName: \"kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003584 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003680 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003900 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003991 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.004290 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.004309 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.009327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts" (OuterVolumeSpecName: "scripts") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.013122 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw" (OuterVolumeSpecName: "kube-api-access-tk8fw") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "kube-api-access-tk8fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.030734 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032213 4795 generic.go:334] "Generic (PLEG): container finished" podID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" exitCode=0 Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032242 4795 generic.go:334] "Generic (PLEG): container finished" podID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" exitCode=2 Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032253 4795 generic.go:334] "Generic (PLEG): container finished" podID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" exitCode=0 Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032260 4795 generic.go:334] "Generic (PLEG): container finished" podID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" exitCode=0 Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032353 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032629 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerDied","Data":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerDied","Data":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerDied","Data":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032724 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerDied","Data":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032735 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerDied","Data":"7d09de57ae215285b7a1c023830c93b72b04b973baabd25b25f676f4a51305aa"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032750 4795 scope.go:117] "RemoveContainer" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.044297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wd5n7" 
event={"ID":"7bd6f80f-7908-42b5-b32a-63d585bd9194","Type":"ContainerDied","Data":"970949fc704fe767a632d55a42c62ddb8f7a120a1f8f4ea713ca42632765cd3e"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.044339 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="970949fc704fe767a632d55a42c62ddb8f7a120a1f8f4ea713ca42632765cd3e" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.044401 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.060948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3748-account-create-update-j2khv" event={"ID":"9f5daae9-920d-496a-ad6a-c016cfb82250","Type":"ContainerDied","Data":"bf45615034cdb988e7a4ea9c726e6ec289a16729b8bfc6970ea397d199a8e5a3"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.060992 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf45615034cdb988e7a4ea9c726e6ec289a16729b8bfc6970ea397d199a8e5a3" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.061098 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.071134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-100d-account-create-update-7l925" event={"ID":"65a51797-b6d0-4b5b-9927-54d4b965469e","Type":"ContainerDied","Data":"4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.072070 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.072524 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.076077 4795 scope.go:117] "RemoveContainer" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.105752 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.107918 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.107996 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk8fw\" (UniqueName: \"kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.122437 4795 scope.go:117] "RemoveContainer" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.134914 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.148417 4795 scope.go:117] "RemoveContainer" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.163525 4795 scope.go:117] "RemoveContainer" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.163967 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": container with ID starting with b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a not found: ID does not exist" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164007 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} err="failed to get container status \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": rpc error: code = NotFound desc = could not find container \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": container with ID starting with b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164034 4795 scope.go:117] "RemoveContainer" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.164363 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": container with ID starting with 
6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02 not found: ID does not exist" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164384 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} err="failed to get container status \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": rpc error: code = NotFound desc = could not find container \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": container with ID starting with 6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164398 4795 scope.go:117] "RemoveContainer" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.164637 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": container with ID starting with 71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db not found: ID does not exist" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164654 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} err="failed to get container status \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": rpc error: code = NotFound desc = could not find container \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": container with ID starting with 71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db not found: ID does not 
exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164667 4795 scope.go:117] "RemoveContainer" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data" (OuterVolumeSpecName: "config-data") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.165423 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": container with ID starting with 47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66 not found: ID does not exist" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165441 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} err="failed to get container status \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": rpc error: code = NotFound desc = could not find container \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": container with ID starting with 47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165455 4795 scope.go:117] "RemoveContainer" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165667 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} err="failed to get container status \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": rpc error: code = NotFound desc = could not find container \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": container with ID starting with b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165711 4795 scope.go:117] "RemoveContainer" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165943 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} err="failed to get container status \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": rpc error: code = NotFound desc = could not find container \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": container with ID starting with 6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165970 4795 scope.go:117] "RemoveContainer" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166202 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} err="failed to get container status \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": rpc error: code = NotFound desc = could not find container \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": container with ID starting with 71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db not found: ID does not 
exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166220 4795 scope.go:117] "RemoveContainer" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166429 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} err="failed to get container status \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": rpc error: code = NotFound desc = could not find container \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": container with ID starting with 47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166445 4795 scope.go:117] "RemoveContainer" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166607 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} err="failed to get container status \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": rpc error: code = NotFound desc = could not find container \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": container with ID starting with b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166621 4795 scope.go:117] "RemoveContainer" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166806 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} err="failed to get container status 
\"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": rpc error: code = NotFound desc = could not find container \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": container with ID starting with 6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166828 4795 scope.go:117] "RemoveContainer" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166968 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} err="failed to get container status \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": rpc error: code = NotFound desc = could not find container \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": container with ID starting with 71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166986 4795 scope.go:117] "RemoveContainer" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167167 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} err="failed to get container status \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": rpc error: code = NotFound desc = could not find container \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": container with ID starting with 47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167185 4795 scope.go:117] "RemoveContainer" 
containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167400 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} err="failed to get container status \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": rpc error: code = NotFound desc = could not find container \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": container with ID starting with b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167416 4795 scope.go:117] "RemoveContainer" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167552 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} err="failed to get container status \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": rpc error: code = NotFound desc = could not find container \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": container with ID starting with 6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167568 4795 scope.go:117] "RemoveContainer" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167726 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} err="failed to get container status \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": rpc error: code = NotFound desc = could 
not find container \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": container with ID starting with 71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167743 4795 scope.go:117] "RemoveContainer" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167896 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} err="failed to get container status \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": rpc error: code = NotFound desc = could not find container \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": container with ID starting with 47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.209910 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.209940 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.399758 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.406753 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.426871 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:22 crc 
kubenswrapper[4795]: E0320 17:39:22.427474 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1dfe60-98b0-4644-b063-831293f9bd5c" containerName="mariadb-database-create" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.427558 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1dfe60-98b0-4644-b063-831293f9bd5c" containerName="mariadb-database-create" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.427624 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5daae9-920d-496a-ad6a-c016cfb82250" containerName="mariadb-account-create-update" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.427676 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5daae9-920d-496a-ad6a-c016cfb82250" containerName="mariadb-account-create-update" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.427763 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="sg-core" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.427816 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="sg-core" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.427877 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42b654e-f003-45dd-a7c4-07655514643e" containerName="mariadb-account-create-update" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.427931 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42b654e-f003-45dd-a7c4-07655514643e" containerName="mariadb-account-create-update" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.427990 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc90399-0b15-4fc6-b441-d7df6925c8aa" containerName="mariadb-database-create" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428039 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc90399-0b15-4fc6-b441-d7df6925c8aa" 
containerName="mariadb-database-create" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.428089 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-notification-agent" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428136 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-notification-agent" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.428184 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="proxy-httpd" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428230 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="proxy-httpd" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.428296 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-central-agent" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428348 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-central-agent" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.428402 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd6f80f-7908-42b5-b32a-63d585bd9194" containerName="mariadb-database-create" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428451 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd6f80f-7908-42b5-b32a-63d585bd9194" containerName="mariadb-database-create" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.428509 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a51797-b6d0-4b5b-9927-54d4b965469e" containerName="mariadb-account-create-update" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428569 4795 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="65a51797-b6d0-4b5b-9927-54d4b965469e" containerName="mariadb-account-create-update" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428807 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc90399-0b15-4fc6-b441-d7df6925c8aa" containerName="mariadb-database-create" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428878 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42b654e-f003-45dd-a7c4-07655514643e" containerName="mariadb-account-create-update" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428935 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="sg-core" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428993 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5daae9-920d-496a-ad6a-c016cfb82250" containerName="mariadb-account-create-update" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429058 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="proxy-httpd" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429116 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1dfe60-98b0-4644-b063-831293f9bd5c" containerName="mariadb-database-create" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429179 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd6f80f-7908-42b5-b32a-63d585bd9194" containerName="mariadb-database-create" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429239 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a51797-b6d0-4b5b-9927-54d4b965469e" containerName="mariadb-account-create-update" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429294 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" 
containerName="ceilometer-notification-agent" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429347 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-central-agent" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.430765 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.439134 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.439417 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.444417 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516297 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516334 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t27qr\" (UniqueName: \"kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516364 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " 
pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516388 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516505 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.618854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.618948 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.618995 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t27qr\" (UniqueName: \"kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619066 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " 
pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619825 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.623486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.624007 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.624303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.631326 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.638337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t27qr\" (UniqueName: \"kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.746073 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:23 crc kubenswrapper[4795]: I0320 17:39:23.261347 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" path="/var/lib/kubelet/pods/2df26ac7-bc78-4c22-9b4f-f3797a09bd53/volumes" Mar 20 17:39:23 crc kubenswrapper[4795]: W0320 17:39:23.283512 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd66859dd_adc4_48d7_8fad_7f536004b0bb.slice/crio-93d7fbfb53b69d60c3384528edac5066b0162c828391056677c83f22c22105ab WatchSource:0}: Error finding container 93d7fbfb53b69d60c3384528edac5066b0162c828391056677c83f22c22105ab: Status 404 returned error can't find the container with id 93d7fbfb53b69d60c3384528edac5066b0162c828391056677c83f22c22105ab Mar 20 17:39:23 crc kubenswrapper[4795]: I0320 17:39:23.285520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.092622 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerID="da11e766148fb6f38d02c50468b495d9c10ec9fe653ddad3b144b8edd961b2d3" exitCode=137 Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.092678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" 
event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerDied","Data":"da11e766148fb6f38d02c50468b495d9c10ec9fe653ddad3b144b8edd961b2d3"} Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.094857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerStarted","Data":"53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757"} Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.094886 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerStarted","Data":"93d7fbfb53b69d60c3384528edac5066b0162c828391056677c83f22c22105ab"} Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.547826 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668446 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668517 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668564 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6ghq\" (UniqueName: \"kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " Mar 20 
17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668649 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668771 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.669285 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs" (OuterVolumeSpecName: "logs") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.669635 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.676981 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq" (OuterVolumeSpecName: "kube-api-access-d6ghq") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "kube-api-access-d6ghq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.693302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.697638 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.716124 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts" (OuterVolumeSpecName: "scripts") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.716554 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data" (OuterVolumeSpecName: "config-data") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.723857 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771446 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771489 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771506 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6ghq\" (UniqueName: \"kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771522 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771535 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771545 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.103321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerDied","Data":"b4a6e3e35ee28437bb36524dd75862315999254a91ff6cc4192d379a2a0e45e4"} Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.103383 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.103371 4795 scope.go:117] "RemoveContainer" containerID="5be00c0e636ec09ccd42a36c542755b2d984e3e3c6dddd06a91f3eb8b8a7efdb" Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.109977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerStarted","Data":"35aea45ae59f7305316fd0a425bf785500761fd9cebc787d5c218633ef2f618b"} Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.140340 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"] Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.156851 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"] Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.188158 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.188479 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-log" containerID="cri-o://601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4" gracePeriod=30 Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.188858 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-httpd" containerID="cri-o://d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad" gracePeriod=30 Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.273038 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" path="/var/lib/kubelet/pods/d3e822b2-0b57-4f89-ab29-caeb483457a1/volumes" 
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.273931 4795 scope.go:117] "RemoveContainer" containerID="da11e766148fb6f38d02c50468b495d9c10ec9fe653ddad3b144b8edd961b2d3" Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.122836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerStarted","Data":"fcafe13e4b4f6f7f50c947513c3f7b933825964f518fc8ea72d3a2aa5ba393dd"} Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.125349 4795 generic.go:334] "Generic (PLEG): container finished" podID="6067c03d-732b-40d9-b017-0365677c39b7" containerID="601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4" exitCode=143 Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.125534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerDied","Data":"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4"} Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.132097 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.132353 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-log" containerID="cri-o://fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006" gracePeriod=30 Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.132480 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-httpd" containerID="cri-o://d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619" gracePeriod=30 Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.241709 
4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.139325 4795 generic.go:334] "Generic (PLEG): container finished" podID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerID="fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006" exitCode=143 Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.139448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerDied","Data":"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006"} Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.871492 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jfhz8"] Mar 20 17:39:27 crc kubenswrapper[4795]: E0320 17:39:27.872136 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.872153 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" Mar 20 17:39:27 crc kubenswrapper[4795]: E0320 17:39:27.872165 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon-log" Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.872171 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon-log" Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.872318 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.872342 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon-log" Mar 20 
17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.872889 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.875526 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.887223 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jfhz8"] Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.887927 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.888272 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-c5nq5" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.043008 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfcrm\" (UniqueName: \"kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.043589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.043640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts\") pod 
\"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.043771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.145611 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.145843 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfcrm\" (UniqueName: \"kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.145980 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.146025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.149918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.150592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.151041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerStarted","Data":"00544b53a989877fad4d8332859c008594ef58360004ffce2654b3a06e72a36e"} Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152380 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152363 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-central-agent" containerID="cri-o://53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757" gracePeriod=30 Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152429 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="proxy-httpd" containerID="cri-o://00544b53a989877fad4d8332859c008594ef58360004ffce2654b3a06e72a36e" gracePeriod=30 Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152458 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-notification-agent" containerID="cri-o://35aea45ae59f7305316fd0a425bf785500761fd9cebc787d5c218633ef2f618b" gracePeriod=30 Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152423 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="sg-core" containerID="cri-o://fcafe13e4b4f6f7f50c947513c3f7b933825964f518fc8ea72d3a2aa5ba393dd" gracePeriod=30 Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.168397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfcrm\" (UniqueName: \"kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.188511 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.192987 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.526997587 podStartE2EDuration="6.192967997s" podCreationTimestamp="2026-03-20 17:39:22 +0000 UTC" firstStartedPulling="2026-03-20 17:39:23.285865885 +0000 UTC m=+1306.743897426" lastFinishedPulling="2026-03-20 17:39:26.951836295 +0000 UTC m=+1310.409867836" observedRunningTime="2026-03-20 17:39:28.18734053 +0000 UTC m=+1311.645372071" watchObservedRunningTime="2026-03-20 17:39:28.192967997 +0000 UTC m=+1311.650999528" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.643572 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jfhz8"] Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.860057 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22mcg\" (UniqueName: \"kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866263 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866298 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866348 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866384 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866511 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866915 4795 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866995 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs" (OuterVolumeSpecName: "logs") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.871876 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.880734 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts" (OuterVolumeSpecName: "scripts") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.881052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg" (OuterVolumeSpecName: "kube-api-access-22mcg") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "kube-api-access-22mcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.926507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.936568 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data" (OuterVolumeSpecName: "config-data") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.955404 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968053 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968231 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968289 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22mcg\" (UniqueName: \"kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968342 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968392 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968467 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968535 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968596 4795 reconciler_common.go:293] "Volume detached 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.987812 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.070433 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.160815 4795 generic.go:334] "Generic (PLEG): container finished" podID="6067c03d-732b-40d9-b017-0365677c39b7" containerID="d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad" exitCode=0 Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.160872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerDied","Data":"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.160897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerDied","Data":"79719142974a75aa1ceb9ca03ec61b98a42d47f6e27982f5c5a5e0502981ad81"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.160913 4795 scope.go:117] "RemoveContainer" containerID="d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.161015 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.166581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" event={"ID":"02a8b32b-fab3-401f-b667-592c8840bd97","Type":"ContainerStarted","Data":"3c4ecc47c641ea00ae26b045f58c4b097551f488bd93d478e0c38b26018a528e"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170428 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerID="00544b53a989877fad4d8332859c008594ef58360004ffce2654b3a06e72a36e" exitCode=0 Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170458 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerID="fcafe13e4b4f6f7f50c947513c3f7b933825964f518fc8ea72d3a2aa5ba393dd" exitCode=2 Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170468 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerID="35aea45ae59f7305316fd0a425bf785500761fd9cebc787d5c218633ef2f618b" exitCode=0 Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerDied","Data":"00544b53a989877fad4d8332859c008594ef58360004ffce2654b3a06e72a36e"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerDied","Data":"fcafe13e4b4f6f7f50c947513c3f7b933825964f518fc8ea72d3a2aa5ba393dd"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerDied","Data":"35aea45ae59f7305316fd0a425bf785500761fd9cebc787d5c218633ef2f618b"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.275372 4795 scope.go:117] "RemoveContainer" containerID="601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.289131 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.299023 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.311030 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:39:29 crc kubenswrapper[4795]: E0320 17:39:29.311574 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-log" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.311645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-log" Mar 20 17:39:29 crc kubenswrapper[4795]: E0320 17:39:29.311729 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-httpd" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.311780 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-httpd" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.312007 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-httpd" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.312069 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-log" Mar 
20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.313091 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.316457 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.325090 4795 scope.go:117] "RemoveContainer" containerID="d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.325235 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:39:29 crc kubenswrapper[4795]: E0320 17:39:29.325563 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad\": container with ID starting with d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad not found: ID does not exist" containerID="d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.325611 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad"} err="failed to get container status \"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad\": rpc error: code = NotFound desc = could not find container \"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad\": container with ID starting with d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad not found: ID does not exist" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.325639 4795 scope.go:117] "RemoveContainer" containerID="601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4" Mar 20 17:39:29 crc kubenswrapper[4795]: E0320 
17:39:29.325910 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4\": container with ID starting with 601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4 not found: ID does not exist" containerID="601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.325940 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4"} err="failed to get container status \"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4\": rpc error: code = NotFound desc = could not find container \"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4\": container with ID starting with 601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4 not found: ID does not exist" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.326121 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.491916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd45l\" (UniqueName: \"kubernetes.io/projected/264c2db4-1919-41ce-aea3-bd777167a9ca-kube-api-access-pd45l\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.491971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-logs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 
17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492157 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492256 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492548 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492578 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " 
pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.595667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.595861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.595890 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.595918 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc 
kubenswrapper[4795]: I0320 17:39:29.595973 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.596037 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd45l\" (UniqueName: \"kubernetes.io/projected/264c2db4-1919-41ce-aea3-bd777167a9ca-kube-api-access-pd45l\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.596071 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-logs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.596095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.596318 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.596876 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.601456 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.603101 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.603099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.604425 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-logs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.607583 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.618310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd45l\" (UniqueName: \"kubernetes.io/projected/264c2db4-1919-41ce-aea3-bd777167a9ca-kube-api-access-pd45l\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.622809 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.710616 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.794175 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8cs8\" (UniqueName: \"kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900190 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900295 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900332 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900380 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.901064 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.901092 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.904183 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.904729 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs" (OuterVolumeSpecName: "logs") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.905176 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8" (OuterVolumeSpecName: "kube-api-access-d8cs8") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "kube-api-access-d8cs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.907330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.908839 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts" (OuterVolumeSpecName: "scripts") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.965014 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.980878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data" (OuterVolumeSpecName: "config-data") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.989873 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003182 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003233 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003246 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003255 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:30 crc 
kubenswrapper[4795]: I0320 17:39:30.003263 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003271 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003279 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003287 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8cs8\" (UniqueName: \"kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.021087 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.105195 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.192072 4795 generic.go:334] "Generic (PLEG): container finished" podID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerID="d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619" exitCode=0 Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.192112 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerDied","Data":"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619"} Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.192140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerDied","Data":"763133b846bce72cdf94f169388305eb03a69632768733042144310abb80652c"} Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.192160 4795 scope.go:117] "RemoveContainer" containerID="d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.192286 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.232368 4795 scope.go:117] "RemoveContainer" containerID="fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.253712 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.261424 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.262109 4795 scope.go:117] "RemoveContainer" containerID="d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619" Mar 20 17:39:30 crc kubenswrapper[4795]: E0320 17:39:30.262476 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619\": container with ID starting with d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619 not found: ID does not exist" containerID="d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619" Mar 20 
17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.262507 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619"} err="failed to get container status \"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619\": rpc error: code = NotFound desc = could not find container \"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619\": container with ID starting with d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619 not found: ID does not exist" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.262526 4795 scope.go:117] "RemoveContainer" containerID="fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006" Mar 20 17:39:30 crc kubenswrapper[4795]: E0320 17:39:30.262838 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006\": container with ID starting with fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006 not found: ID does not exist" containerID="fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.262859 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006"} err="failed to get container status \"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006\": rpc error: code = NotFound desc = could not find container \"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006\": container with ID starting with fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006 not found: ID does not exist" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.278303 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 
17:39:30 crc kubenswrapper[4795]: E0320 17:39:30.278722 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-log" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.278738 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-log" Mar 20 17:39:30 crc kubenswrapper[4795]: E0320 17:39:30.278751 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-httpd" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.278759 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-httpd" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.278904 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-log" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.278929 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-httpd" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.279757 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.283018 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.283170 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.290351 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.341553 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424222 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc 
kubenswrapper[4795]: I0320 17:39:30.424400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424419 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgnm4\" (UniqueName: \"kubernetes.io/projected/81d40eb0-c26d-46e7-b8be-631de2f502b9-kube-api-access-rgnm4\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424458 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-logs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 
20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526297 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526584 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgnm4\" (UniqueName: \"kubernetes.io/projected/81d40eb0-c26d-46e7-b8be-631de2f502b9-kube-api-access-rgnm4\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-logs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526929 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.527097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.527180 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.527247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.527249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-logs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.531302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.533788 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.534231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.551583 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgnm4\" (UniqueName: \"kubernetes.io/projected/81d40eb0-c26d-46e7-b8be-631de2f502b9-kube-api-access-rgnm4\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.551905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.587347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.654202 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:31 crc kubenswrapper[4795]: I0320 17:39:31.201929 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:39:31 crc kubenswrapper[4795]: W0320 17:39:31.209878 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81d40eb0_c26d_46e7_b8be_631de2f502b9.slice/crio-f23a74eea88d853616f300f0fb298be2cb779054021a542cd507b565a39de8d5 WatchSource:0}: Error finding container f23a74eea88d853616f300f0fb298be2cb779054021a542cd507b565a39de8d5: Status 404 returned error can't find the container with id f23a74eea88d853616f300f0fb298be2cb779054021a542cd507b565a39de8d5 Mar 20 17:39:31 crc kubenswrapper[4795]: I0320 17:39:31.212349 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"264c2db4-1919-41ce-aea3-bd777167a9ca","Type":"ContainerStarted","Data":"c8d456cb515dde74965a84bbb63a0b4e1a16de133a8a826140f21c0bc2ec153f"} Mar 20 17:39:31 crc kubenswrapper[4795]: I0320 17:39:31.212383 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"264c2db4-1919-41ce-aea3-bd777167a9ca","Type":"ContainerStarted","Data":"f5de7a904eadf145932872ba174825a1efd9acaa99b0bebe8335c2df8a0df661"} Mar 20 17:39:31 crc kubenswrapper[4795]: I0320 17:39:31.269253 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6067c03d-732b-40d9-b017-0365677c39b7" path="/var/lib/kubelet/pods/6067c03d-732b-40d9-b017-0365677c39b7/volumes" Mar 20 17:39:31 crc kubenswrapper[4795]: I0320 17:39:31.270342 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" path="/var/lib/kubelet/pods/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197/volumes" Mar 20 17:39:32 crc kubenswrapper[4795]: I0320 17:39:32.230237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81d40eb0-c26d-46e7-b8be-631de2f502b9","Type":"ContainerStarted","Data":"d6ccf3673a3579c18c8c6ffc094d3d920e67e767ceecf42ab48437e6e88280d7"} Mar 20 17:39:32 crc kubenswrapper[4795]: I0320 17:39:32.230922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81d40eb0-c26d-46e7-b8be-631de2f502b9","Type":"ContainerStarted","Data":"f23a74eea88d853616f300f0fb298be2cb779054021a542cd507b565a39de8d5"} Mar 20 17:39:32 crc kubenswrapper[4795]: I0320 17:39:32.232679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"264c2db4-1919-41ce-aea3-bd777167a9ca","Type":"ContainerStarted","Data":"f7051b18aca469ed7b78a3975127f1fc84912f0ba6390c83082798937e7d0697"} Mar 20 17:39:32 crc kubenswrapper[4795]: I0320 17:39:32.248882 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.2488651060000002 podStartE2EDuration="3.248865106s" podCreationTimestamp="2026-03-20 17:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:32.246637607 +0000 UTC m=+1315.704669148" watchObservedRunningTime="2026-03-20 17:39:32.248865106 +0000 UTC m=+1315.706896647" Mar 20 17:39:33 crc kubenswrapper[4795]: I0320 17:39:33.247723 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerID="53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757" exitCode=0 Mar 20 17:39:33 crc kubenswrapper[4795]: I0320 17:39:33.247904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerDied","Data":"53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757"} Mar 20 17:39:33 crc kubenswrapper[4795]: I0320 17:39:33.250598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81d40eb0-c26d-46e7-b8be-631de2f502b9","Type":"ContainerStarted","Data":"d29803125c153802162e5de76862336a48b202311af90c926a1dad202c4a61a5"} Mar 20 17:39:33 crc kubenswrapper[4795]: I0320 17:39:33.269891 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.269872952 podStartE2EDuration="3.269872952s" podCreationTimestamp="2026-03-20 17:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:33.268791287 +0000 UTC m=+1316.726822868" watchObservedRunningTime="2026-03-20 17:39:33.269872952 +0000 UTC m=+1316.727904513" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.419666 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.568731 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.568785 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.568828 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.568852 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.568944 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.569002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t27qr\" (UniqueName: 
\"kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.569071 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.569370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.569789 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.569996 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.573368 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts" (OuterVolumeSpecName: "scripts") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.574893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr" (OuterVolumeSpecName: "kube-api-access-t27qr") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "kube-api-access-t27qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.602469 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.645288 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.671410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data" (OuterVolumeSpecName: "config-data") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.671671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.672271 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t27qr\" (UniqueName: \"kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.672302 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.672322 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.672341 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.672358 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: W0320 17:39:37.672452 4795 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d66859dd-adc4-48d7-8fad-7f536004b0bb/volumes/kubernetes.io~secret/config-data Mar 20 17:39:37 crc 
kubenswrapper[4795]: I0320 17:39:37.672468 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data" (OuterVolumeSpecName: "config-data") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.773811 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.299456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" event={"ID":"02a8b32b-fab3-401f-b667-592c8840bd97","Type":"ContainerStarted","Data":"181a6c95c401e76e8326dba0d2e07f193da50d6dbe0b9151509c36ea5ad10c3e"} Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.304254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerDied","Data":"93d7fbfb53b69d60c3384528edac5066b0162c828391056677c83f22c22105ab"} Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.304310 4795 scope.go:117] "RemoveContainer" containerID="00544b53a989877fad4d8332859c008594ef58360004ffce2654b3a06e72a36e" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.304314 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.325520 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" podStartSLOduration=2.536129034 podStartE2EDuration="11.325499199s" podCreationTimestamp="2026-03-20 17:39:27 +0000 UTC" firstStartedPulling="2026-03-20 17:39:28.648220389 +0000 UTC m=+1312.106251930" lastFinishedPulling="2026-03-20 17:39:37.437590514 +0000 UTC m=+1320.895622095" observedRunningTime="2026-03-20 17:39:38.318280732 +0000 UTC m=+1321.776312273" watchObservedRunningTime="2026-03-20 17:39:38.325499199 +0000 UTC m=+1321.783530740" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.359984 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.371025 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400121 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:38 crc kubenswrapper[4795]: E0320 17:39:38.400580 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-notification-agent" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400602 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-notification-agent" Mar 20 17:39:38 crc kubenswrapper[4795]: E0320 17:39:38.400619 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-central-agent" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400628 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-central-agent" Mar 20 17:39:38 crc 
kubenswrapper[4795]: E0320 17:39:38.400643 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="proxy-httpd" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400652 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="proxy-httpd" Mar 20 17:39:38 crc kubenswrapper[4795]: E0320 17:39:38.400665 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="sg-core" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400675 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="sg-core" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400937 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-central-agent" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400954 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="sg-core" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400964 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="proxy-httpd" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400981 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-notification-agent" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.402727 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.405365 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.405572 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.411916 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.485903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.486867 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.487063 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.487178 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 
17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.487279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x96j4\" (UniqueName: \"kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.487425 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.487507 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589110 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x96j4\" (UniqueName: \"kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589133 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589199 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589271 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.590396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: 
I0320 17:39:38.591063 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.594011 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.595080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.596103 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.597365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.625241 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x96j4\" (UniqueName: \"kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4\") pod \"ceilometer-0\" (UID: 
\"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.725817 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.143018 4795 scope.go:117] "RemoveContainer" containerID="fcafe13e4b4f6f7f50c947513c3f7b933825964f518fc8ea72d3a2aa5ba393dd" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.173017 4795 scope.go:117] "RemoveContainer" containerID="35aea45ae59f7305316fd0a425bf785500761fd9cebc787d5c218633ef2f618b" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.262785 4795 scope.go:117] "RemoveContainer" containerID="53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.271829 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" path="/var/lib/kubelet/pods/d66859dd-adc4-48d7-8fad-7f536004b0bb/volumes" Mar 20 17:39:39 crc kubenswrapper[4795]: E0320 17:39:39.327870 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757\": container with ID starting with 53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757 not found: ID does not exist" containerID="53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.649054 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:39 crc kubenswrapper[4795]: W0320 17:39:39.664992 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e54cb1b_1f35_4344_899a_395d140ac8c3.slice/crio-bd89bf849cc06624e535b1c043affece374c22e807d01d0820b5405eacdd2f3c WatchSource:0}: Error finding container 
bd89bf849cc06624e535b1c043affece374c22e807d01d0820b5405eacdd2f3c: Status 404 returned error can't find the container with id bd89bf849cc06624e535b1c043affece374c22e807d01d0820b5405eacdd2f3c Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.711734 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.711781 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.744356 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.758218 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.341734 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerStarted","Data":"bd89bf849cc06624e535b1c043affece374c22e807d01d0820b5405eacdd2f3c"} Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.342130 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.342151 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.654838 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.654887 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 
17:39:40.714640 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.726538 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:41 crc kubenswrapper[4795]: I0320 17:39:41.353612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerStarted","Data":"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c"} Mar 20 17:39:41 crc kubenswrapper[4795]: I0320 17:39:41.353820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerStarted","Data":"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98"} Mar 20 17:39:41 crc kubenswrapper[4795]: I0320 17:39:41.356045 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:41 crc kubenswrapper[4795]: I0320 17:39:41.356086 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:42 crc kubenswrapper[4795]: I0320 17:39:42.180786 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:39:42 crc kubenswrapper[4795]: I0320 17:39:42.369254 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:39:42 crc kubenswrapper[4795]: I0320 17:39:42.369959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerStarted","Data":"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2"} Mar 20 17:39:42 crc kubenswrapper[4795]: I0320 17:39:42.390104 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:39:43 crc kubenswrapper[4795]: I0320 17:39:43.264391 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:43 crc kubenswrapper[4795]: I0320 17:39:43.400481 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:39:43 crc kubenswrapper[4795]: I0320 17:39:43.408578 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:45 crc kubenswrapper[4795]: I0320 17:39:45.435210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerStarted","Data":"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c"} Mar 20 17:39:45 crc kubenswrapper[4795]: I0320 17:39:45.435743 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:39:45 crc kubenswrapper[4795]: I0320 17:39:45.470027 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.013865799 podStartE2EDuration="7.47001067s" podCreationTimestamp="2026-03-20 17:39:38 +0000 UTC" firstStartedPulling="2026-03-20 17:39:39.668574982 +0000 UTC m=+1323.126606523" lastFinishedPulling="2026-03-20 17:39:45.124719853 +0000 UTC m=+1328.582751394" observedRunningTime="2026-03-20 17:39:45.464072973 +0000 UTC m=+1328.922104514" watchObservedRunningTime="2026-03-20 17:39:45.47001067 +0000 UTC m=+1328.928042211" Mar 20 17:39:47 crc kubenswrapper[4795]: I0320 17:39:47.269124 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:48 crc kubenswrapper[4795]: I0320 17:39:48.465271 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-central-agent" containerID="cri-o://e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98" gracePeriod=30 Mar 20 17:39:48 crc kubenswrapper[4795]: I0320 17:39:48.465332 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="proxy-httpd" containerID="cri-o://2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c" gracePeriod=30 Mar 20 17:39:48 crc kubenswrapper[4795]: I0320 17:39:48.465429 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="sg-core" containerID="cri-o://fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2" gracePeriod=30 Mar 20 17:39:48 crc kubenswrapper[4795]: I0320 17:39:48.465353 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-notification-agent" containerID="cri-o://1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c" gracePeriod=30 Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.474700 4795 generic.go:334] "Generic (PLEG): container finished" podID="02a8b32b-fab3-401f-b667-592c8840bd97" containerID="181a6c95c401e76e8326dba0d2e07f193da50d6dbe0b9151509c36ea5ad10c3e" exitCode=0 Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.475019 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" event={"ID":"02a8b32b-fab3-401f-b667-592c8840bd97","Type":"ContainerDied","Data":"181a6c95c401e76e8326dba0d2e07f193da50d6dbe0b9151509c36ea5ad10c3e"} Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479449 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e54cb1b-1f35-4344-899a-395d140ac8c3" 
containerID="2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c" exitCode=0 Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479470 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerID="fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2" exitCode=2 Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479477 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerID="1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c" exitCode=0 Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479492 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerDied","Data":"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c"} Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479508 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerDied","Data":"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2"} Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerDied","Data":"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c"} Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.856735 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.887908 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.887979 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888012 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888032 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888121 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x96j4\" (UniqueName: \"kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888192 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888441 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888852 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.894199 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts" (OuterVolumeSpecName: "scripts") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.894261 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4" (OuterVolumeSpecName: "kube-api-access-x96j4") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "kube-api-access-x96j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.932061 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.980960 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data" (OuterVolumeSpecName: "config-data") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.990754 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.990788 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.990802 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.990816 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x96j4\" (UniqueName: \"kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.990828 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.994444 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.092067 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.491939 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerID="e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98" exitCode=0 Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.492014 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerDied","Data":"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98"} Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.492037 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.492086 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerDied","Data":"bd89bf849cc06624e535b1c043affece374c22e807d01d0820b5405eacdd2f3c"} Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.492117 4795 scope.go:117] "RemoveContainer" containerID="2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.542290 4795 scope.go:117] "RemoveContainer" containerID="fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.548910 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.558276 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.578870 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.579278 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-notification-agent" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579302 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-notification-agent" Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.579328 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="proxy-httpd" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579337 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="proxy-httpd" Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.579365 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-central-agent" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579374 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-central-agent" Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.579387 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="sg-core" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579395 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="sg-core" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579597 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="proxy-httpd" Mar 20 
17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579619 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-central-agent" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579631 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-notification-agent" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579655 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="sg-core" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.581580 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.587132 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.613931 4795 scope.go:117] "RemoveContainer" containerID="1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.613972 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.614298 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.650417 4795 scope.go:117] "RemoveContainer" containerID="e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.688859 4795 scope.go:117] "RemoveContainer" containerID="2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c" Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.691100 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c\": container with ID starting with 2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c not found: ID does not exist" containerID="2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.691149 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c"} err="failed to get container status \"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c\": rpc error: code = NotFound desc = could not find container \"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c\": container with ID starting with 2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c not found: ID does not exist" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.691174 4795 scope.go:117] "RemoveContainer" containerID="fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2" Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.691531 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2\": container with ID starting with fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2 not found: ID does not exist" containerID="fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.691582 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2"} err="failed to get container status \"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2\": rpc error: code = NotFound desc = could not find container \"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2\": container with ID 
starting with fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2 not found: ID does not exist" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.691602 4795 scope.go:117] "RemoveContainer" containerID="1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c" Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.691994 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c\": container with ID starting with 1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c not found: ID does not exist" containerID="1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.692012 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c"} err="failed to get container status \"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c\": rpc error: code = NotFound desc = could not find container \"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c\": container with ID starting with 1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c not found: ID does not exist" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.692026 4795 scope.go:117] "RemoveContainer" containerID="e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98" Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.692245 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98\": container with ID starting with e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98 not found: ID does not exist" containerID="e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98" Mar 20 
17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.692259 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98"} err="failed to get container status \"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98\": rpc error: code = NotFound desc = could not find container \"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98\": container with ID starting with e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98 not found: ID does not exist" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715077 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715179 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715207 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bll6z\" (UniqueName: \"kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715252 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817237 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817259 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817391 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817410 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll6z\" (UniqueName: \"kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.818885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " 
pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.818967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.822301 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.822402 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.823213 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.823559 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.833790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bll6z\" (UniqueName: 
\"kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.897575 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.949792 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.020013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data\") pod \"02a8b32b-fab3-401f-b667-592c8840bd97\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.020464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfcrm\" (UniqueName: \"kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm\") pod \"02a8b32b-fab3-401f-b667-592c8840bd97\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.021205 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts\") pod \"02a8b32b-fab3-401f-b667-592c8840bd97\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.021355 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle\") pod \"02a8b32b-fab3-401f-b667-592c8840bd97\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " Mar 20 17:39:51 crc 
kubenswrapper[4795]: I0320 17:39:51.025814 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts" (OuterVolumeSpecName: "scripts") pod "02a8b32b-fab3-401f-b667-592c8840bd97" (UID: "02a8b32b-fab3-401f-b667-592c8840bd97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.026060 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm" (OuterVolumeSpecName: "kube-api-access-vfcrm") pod "02a8b32b-fab3-401f-b667-592c8840bd97" (UID: "02a8b32b-fab3-401f-b667-592c8840bd97"). InnerVolumeSpecName "kube-api-access-vfcrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.055081 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data" (OuterVolumeSpecName: "config-data") pod "02a8b32b-fab3-401f-b667-592c8840bd97" (UID: "02a8b32b-fab3-401f-b667-592c8840bd97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.064769 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02a8b32b-fab3-401f-b667-592c8840bd97" (UID: "02a8b32b-fab3-401f-b667-592c8840bd97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.124891 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfcrm\" (UniqueName: \"kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.124926 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.124935 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.124944 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.263250 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" path="/var/lib/kubelet/pods/7e54cb1b-1f35-4344-899a-395d140ac8c3/volumes" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.391918 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.503487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" event={"ID":"02a8b32b-fab3-401f-b667-592c8840bd97","Type":"ContainerDied","Data":"3c4ecc47c641ea00ae26b045f58c4b097551f488bd93d478e0c38b26018a528e"} Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.503545 4795 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3c4ecc47c641ea00ae26b045f58c4b097551f488bd93d478e0c38b26018a528e" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.503541 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.506756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerStarted","Data":"39c11fccd9e673059022bf047af401ca209830155fcf251e8d72aeeb8fa6e0d2"} Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.593893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:39:51 crc kubenswrapper[4795]: E0320 17:39:51.594494 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a8b32b-fab3-401f-b667-592c8840bd97" containerName="nova-cell0-conductor-db-sync" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.594516 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a8b32b-fab3-401f-b667-592c8840bd97" containerName="nova-cell0-conductor-db-sync" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.594766 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a8b32b-fab3-401f-b667-592c8840bd97" containerName="nova-cell0-conductor-db-sync" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.595503 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.597055 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.597653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-c5nq5" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.620413 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.736180 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.736281 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9qgc\" (UniqueName: \"kubernetes.io/projected/5916e4d2-2863-4088-be97-cf368906820b-kube-api-access-r9qgc\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.736307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.837525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9qgc\" (UniqueName: 
\"kubernetes.io/projected/5916e4d2-2863-4088-be97-cf368906820b-kube-api-access-r9qgc\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.837564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.837659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.842719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.851744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.863663 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9qgc\" (UniqueName: \"kubernetes.io/projected/5916e4d2-2863-4088-be97-cf368906820b-kube-api-access-r9qgc\") pod \"nova-cell0-conductor-0\" (UID: 
\"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.918717 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:52 crc kubenswrapper[4795]: I0320 17:39:52.164994 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:39:52 crc kubenswrapper[4795]: W0320 17:39:52.171105 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5916e4d2_2863_4088_be97_cf368906820b.slice/crio-2d8c939f22643424586783c3f78925e3a69fe31a7bdbcc29810c323e6661b500 WatchSource:0}: Error finding container 2d8c939f22643424586783c3f78925e3a69fe31a7bdbcc29810c323e6661b500: Status 404 returned error can't find the container with id 2d8c939f22643424586783c3f78925e3a69fe31a7bdbcc29810c323e6661b500 Mar 20 17:39:52 crc kubenswrapper[4795]: I0320 17:39:52.518863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5916e4d2-2863-4088-be97-cf368906820b","Type":"ContainerStarted","Data":"b21e2cbeacbd4128217704e7ce8b39085fc6ffcb9e96a06db0805f198d443a18"} Mar 20 17:39:52 crc kubenswrapper[4795]: I0320 17:39:52.519277 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5916e4d2-2863-4088-be97-cf368906820b","Type":"ContainerStarted","Data":"2d8c939f22643424586783c3f78925e3a69fe31a7bdbcc29810c323e6661b500"} Mar 20 17:39:52 crc kubenswrapper[4795]: I0320 17:39:52.519306 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 17:39:52 crc kubenswrapper[4795]: I0320 17:39:52.561788 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.561767744 podStartE2EDuration="1.561767744s" 
podCreationTimestamp="2026-03-20 17:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:52.536761308 +0000 UTC m=+1335.994792889" watchObservedRunningTime="2026-03-20 17:39:52.561767744 +0000 UTC m=+1336.019799305" Mar 20 17:39:53 crc kubenswrapper[4795]: I0320 17:39:53.547219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerStarted","Data":"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1"} Mar 20 17:39:55 crc kubenswrapper[4795]: I0320 17:39:55.580498 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerStarted","Data":"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc"} Mar 20 17:39:55 crc kubenswrapper[4795]: I0320 17:39:55.581299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerStarted","Data":"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c"} Mar 20 17:39:58 crc kubenswrapper[4795]: I0320 17:39:58.613534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerStarted","Data":"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f"} Mar 20 17:39:58 crc kubenswrapper[4795]: I0320 17:39:58.614382 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:39:58 crc kubenswrapper[4795]: I0320 17:39:58.668825 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.481911165 podStartE2EDuration="8.668801292s" podCreationTimestamp="2026-03-20 17:39:50 +0000 UTC" firstStartedPulling="2026-03-20 
17:39:51.399063296 +0000 UTC m=+1334.857094847" lastFinishedPulling="2026-03-20 17:39:57.585953393 +0000 UTC m=+1341.043984974" observedRunningTime="2026-03-20 17:39:58.647077739 +0000 UTC m=+1342.105109290" watchObservedRunningTime="2026-03-20 17:39:58.668801292 +0000 UTC m=+1342.126832833" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.177019 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567140-s5wtb"] Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.178867 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.181200 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.181463 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.181520 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.187808 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-s5wtb"] Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.353636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52dck\" (UniqueName: \"kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck\") pod \"auto-csr-approver-29567140-s5wtb\" (UID: \"f9e6fe9e-d22e-420c-b050-a00a53749f1f\") " pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.456101 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52dck\" (UniqueName: 
\"kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck\") pod \"auto-csr-approver-29567140-s5wtb\" (UID: \"f9e6fe9e-d22e-420c-b050-a00a53749f1f\") " pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.484566 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52dck\" (UniqueName: \"kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck\") pod \"auto-csr-approver-29567140-s5wtb\" (UID: \"f9e6fe9e-d22e-420c-b050-a00a53749f1f\") " pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.503555 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.980955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-s5wtb"] Mar 20 17:40:00 crc kubenswrapper[4795]: W0320 17:40:00.992939 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e6fe9e_d22e_420c_b050_a00a53749f1f.slice/crio-69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f WatchSource:0}: Error finding container 69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f: Status 404 returned error can't find the container with id 69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f Mar 20 17:40:01 crc kubenswrapper[4795]: I0320 17:40:01.644208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" event={"ID":"f9e6fe9e-d22e-420c-b050-a00a53749f1f","Type":"ContainerStarted","Data":"69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f"} Mar 20 17:40:01 crc kubenswrapper[4795]: I0320 17:40:01.946500 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.449470 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-kmgk9"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.451044 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.453278 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.453584 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.461202 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kmgk9"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.608015 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.608412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.608471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh9qq\" (UniqueName: 
\"kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.608515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.709763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.709866 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.709927 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh9qq\" (UniqueName: \"kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.709964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.718990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.723362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.723622 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.764264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh9qq\" (UniqueName: \"kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.769569 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.771041 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.774128 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.786141 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.787197 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.859053 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.860214 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.863192 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.867039 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.880558 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.883214 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.886068 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.914173 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.914454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.914601 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.914624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9vf2\" (UniqueName: \"kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.944149 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.010263 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.012003 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016639 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbkkt\" (UniqueName: \"kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt\") pod \"nova-scheduler-0\" (UID: 
\"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016742 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbfmr\" (UniqueName: \"kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016768 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9vf2\" (UniqueName: \"kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 
17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.020506 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.020645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.024595 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.024651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.042148 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.058007 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9vf2\" (UniqueName: \"kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.082547 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.084476 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.100031 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118335 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjvw6\" (UniqueName: \"kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118508 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118524 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkkt\" (UniqueName: \"kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118567 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbfmr\" (UniqueName: \"kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118615 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118634 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data\") pod \"nova-metadata-0\" (UID: 
\"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.123074 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.123125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.128101 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.133092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.146663 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbkkt\" (UniqueName: \"kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.147141 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xbfmr\" (UniqueName: \"kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.153591 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.197501 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220183 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220289 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9h7l\" (UniqueName: \"kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220371 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjvw6\" (UniqueName: \"kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220639 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.222240 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.227551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.230923 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.241804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjvw6\" (UniqueName: \"kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " 
pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322259 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9h7l\" (UniqueName: \"kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 
17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322495 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.323449 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.324047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.324125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.324532 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.325037 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.345642 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9h7l\" (UniqueName: \"kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.357339 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.357451 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.403447 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.561081 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kmgk9"] Mar 20 17:40:03 crc kubenswrapper[4795]: W0320 17:40:03.603746 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18794d5c_e43a_44dc_9510_763a31275104.slice/crio-221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf WatchSource:0}: Error finding container 221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf: Status 404 returned error can't find the container with id 221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.663513 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kmgk9" event={"ID":"18794d5c-e43a-44dc-9510-763a31275104","Type":"ContainerStarted","Data":"221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf"} Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.665088 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" event={"ID":"f9e6fe9e-d22e-420c-b050-a00a53749f1f","Type":"ContainerDied","Data":"8ac90f4263985d5a19d6f00ac01d70eb81681c5a298c4e2c5302052e573286a6"} Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.665157 4795 generic.go:334] "Generic (PLEG): container finished" podID="f9e6fe9e-d22e-420c-b050-a00a53749f1f" containerID="8ac90f4263985d5a19d6f00ac01d70eb81681c5a298c4e2c5302052e573286a6" exitCode=0 Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.714533 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.856816 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:03 crc 
kubenswrapper[4795]: I0320 17:40:03.879017 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcxtg"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.880224 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.882400 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.883885 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.902012 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcxtg"] Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.006182 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:04 crc kubenswrapper[4795]: W0320 17:40:04.019323 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d499f64_fbe0_4f89_af22_619a306e7857.slice/crio-033d346b30a1db4f7e9a01124daad98761199eedb3969eb07af8bdf4a1a9d7f0 WatchSource:0}: Error finding container 033d346b30a1db4f7e9a01124daad98761199eedb3969eb07af8bdf4a1a9d7f0: Status 404 returned error can't find the container with id 033d346b30a1db4f7e9a01124daad98761199eedb3969eb07af8bdf4a1a9d7f0 Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.026180 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"] Mar 20 17:40:04 crc kubenswrapper[4795]: W0320 17:40:04.026715 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74d437e5_b643_4a6f_a9d9_50cf8166d0af.slice/crio-a6f7fa5a8cd4bb19f7f275862231fd5657f29dc6b3da296fa035e77f5df8d7d5 WatchSource:0}: Error finding container a6f7fa5a8cd4bb19f7f275862231fd5657f29dc6b3da296fa035e77f5df8d7d5: Status 404 returned error can't find the container with id a6f7fa5a8cd4bb19f7f275862231fd5657f29dc6b3da296fa035e77f5df8d7d5 Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.036061 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.038896 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.038982 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.039010 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.039081 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvf65\" 
(UniqueName: \"kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.140598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvf65\" (UniqueName: \"kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.140745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.140784 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.140802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.144829 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.145277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.148090 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.161164 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvf65\" (UniqueName: \"kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.200924 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.667489 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcxtg"] Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.720252 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerStarted","Data":"ab84cd82bd3f42e54f99be97caf2de2ed929e55f48d616683e9c8f126716e9cd"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.729649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2f35863-4f45-43d5-b600-9028b32195d7","Type":"ContainerStarted","Data":"a81953c7eff097e8b8de2cddd252282ad6966a7afa286fafb4eb334123de90a3"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.732237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerStarted","Data":"c016e3406d811f115716e682934de46c5a980e71042e6b56bc4a4a96322456f5"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.739463 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kmgk9" event={"ID":"18794d5c-e43a-44dc-9510-763a31275104","Type":"ContainerStarted","Data":"2bb981684096c6d7989fb2cb73e5f71d3f241740ae3f49c070189810ce7e7bb1"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.740970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74d437e5-b643-4a6f-a9d9-50cf8166d0af","Type":"ContainerStarted","Data":"a6f7fa5a8cd4bb19f7f275862231fd5657f29dc6b3da296fa035e77f5df8d7d5"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.745929 4795 generic.go:334] "Generic (PLEG): container finished" podID="3d499f64-fbe0-4f89-af22-619a306e7857" 
containerID="bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953" exitCode=0 Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.746518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" event={"ID":"3d499f64-fbe0-4f89-af22-619a306e7857","Type":"ContainerDied","Data":"bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.746543 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" event={"ID":"3d499f64-fbe0-4f89-af22-619a306e7857","Type":"ContainerStarted","Data":"033d346b30a1db4f7e9a01124daad98761199eedb3969eb07af8bdf4a1a9d7f0"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.769735 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-kmgk9" podStartSLOduration=2.769718817 podStartE2EDuration="2.769718817s" podCreationTimestamp="2026-03-20 17:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:04.760194968 +0000 UTC m=+1348.218226509" watchObservedRunningTime="2026-03-20 17:40:04.769718817 +0000 UTC m=+1348.227750358" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.139416 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.267824 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52dck\" (UniqueName: \"kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck\") pod \"f9e6fe9e-d22e-420c-b050-a00a53749f1f\" (UID: \"f9e6fe9e-d22e-420c-b050-a00a53749f1f\") " Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.272742 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck" (OuterVolumeSpecName: "kube-api-access-52dck") pod "f9e6fe9e-d22e-420c-b050-a00a53749f1f" (UID: "f9e6fe9e-d22e-420c-b050-a00a53749f1f"). InnerVolumeSpecName "kube-api-access-52dck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.370439 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52dck\" (UniqueName: \"kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.756096 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" event={"ID":"4f61db3a-a7de-495d-8305-b9e2910415e2","Type":"ContainerStarted","Data":"de6cb1e775438df313c59586663c35fb681b66b389159fc9df68bc69d850ac1c"} Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.756217 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" event={"ID":"4f61db3a-a7de-495d-8305-b9e2910415e2","Type":"ContainerStarted","Data":"be403fd93662530286fc5651363ea195fe87d753696088488bd47663f1933769"} Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.760045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567140-s5wtb" event={"ID":"f9e6fe9e-d22e-420c-b050-a00a53749f1f","Type":"ContainerDied","Data":"69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f"} Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.760076 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.760120 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.763517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" event={"ID":"3d499f64-fbe0-4f89-af22-619a306e7857","Type":"ContainerStarted","Data":"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25"} Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.763583 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.775369 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" podStartSLOduration=2.7753529500000003 podStartE2EDuration="2.77535295s" podCreationTimestamp="2026-03-20 17:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:05.7728008 +0000 UTC m=+1349.230832351" watchObservedRunningTime="2026-03-20 17:40:05.77535295 +0000 UTC m=+1349.233384491" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.799733 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" podStartSLOduration=2.799714836 podStartE2EDuration="2.799714836s" podCreationTimestamp="2026-03-20 17:40:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:05.797149815 +0000 UTC m=+1349.255181356" watchObservedRunningTime="2026-03-20 17:40:05.799714836 +0000 UTC m=+1349.257746377" Mar 20 17:40:06 crc kubenswrapper[4795]: I0320 17:40:06.207299 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-glqvv"] Mar 20 17:40:06 crc kubenswrapper[4795]: I0320 17:40:06.214171 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-glqvv"] Mar 20 17:40:06 crc kubenswrapper[4795]: I0320 17:40:06.475536 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:06 crc kubenswrapper[4795]: I0320 17:40:06.508651 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.269618 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1462264f-6c8a-4024-9465-3e7d2908ba24" path="/var/lib/kubelet/pods/1462264f-6c8a-4024-9465-3e7d2908ba24/volumes" Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.779811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74d437e5-b643-4a6f-a9d9-50cf8166d0af","Type":"ContainerStarted","Data":"65f55a27fa8e508e44d5a8d1bb44f105a57a43fd7b8f29f73e4c9d5944daa0d7"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.779931 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://65f55a27fa8e508e44d5a8d1bb44f105a57a43fd7b8f29f73e4c9d5944daa0d7" gracePeriod=30 Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.782642 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerStarted","Data":"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.782700 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerStarted","Data":"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.782821 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-log" containerID="cri-o://989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806" gracePeriod=30 Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.782874 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-metadata" containerID="cri-o://31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893" gracePeriod=30 Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.786022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2f35863-4f45-43d5-b600-9028b32195d7","Type":"ContainerStarted","Data":"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.788117 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerStarted","Data":"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.788142 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerStarted","Data":"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.805352 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.971128764 podStartE2EDuration="5.805315083s" podCreationTimestamp="2026-03-20 17:40:02 +0000 UTC" firstStartedPulling="2026-03-20 17:40:04.028315855 +0000 UTC m=+1347.486347396" lastFinishedPulling="2026-03-20 17:40:06.862502174 +0000 UTC m=+1350.320533715" observedRunningTime="2026-03-20 17:40:07.796014381 +0000 UTC m=+1351.254045942" watchObservedRunningTime="2026-03-20 17:40:07.805315083 +0000 UTC m=+1351.263346634" Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.831958 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.691181171 podStartE2EDuration="5.831943451s" podCreationTimestamp="2026-03-20 17:40:02 +0000 UTC" firstStartedPulling="2026-03-20 17:40:03.721250289 +0000 UTC m=+1347.179281820" lastFinishedPulling="2026-03-20 17:40:06.862012569 +0000 UTC m=+1350.320044100" observedRunningTime="2026-03-20 17:40:07.819469148 +0000 UTC m=+1351.277500699" watchObservedRunningTime="2026-03-20 17:40:07.831943451 +0000 UTC m=+1351.289974992" Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.852910 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.00154009 podStartE2EDuration="5.852844247s" podCreationTimestamp="2026-03-20 17:40:02 +0000 UTC" firstStartedPulling="2026-03-20 17:40:04.014824102 +0000 UTC m=+1347.472855643" lastFinishedPulling="2026-03-20 17:40:06.866128259 +0000 UTC m=+1350.324159800" observedRunningTime="2026-03-20 17:40:07.842602455 +0000 UTC m=+1351.300634016" watchObservedRunningTime="2026-03-20 17:40:07.852844247 +0000 UTC m=+1351.310875788" Mar 20 
17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.874457 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8924514439999998 podStartE2EDuration="5.874436515s" podCreationTimestamp="2026-03-20 17:40:02 +0000 UTC" firstStartedPulling="2026-03-20 17:40:03.878796109 +0000 UTC m=+1347.336827650" lastFinishedPulling="2026-03-20 17:40:06.86078118 +0000 UTC m=+1350.318812721" observedRunningTime="2026-03-20 17:40:07.860750815 +0000 UTC m=+1351.318782366" watchObservedRunningTime="2026-03-20 17:40:07.874436515 +0000 UTC m=+1351.332468066" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.199031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.357921 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.387524 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.530865 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle\") pod \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.531103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs\") pod \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.531183 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data\") pod \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.531224 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjvw6\" (UniqueName: \"kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6\") pod \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.531659 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs" (OuterVolumeSpecName: "logs") pod "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" (UID: "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.537097 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6" (OuterVolumeSpecName: "kube-api-access-cjvw6") pod "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" (UID: "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950"). InnerVolumeSpecName "kube-api-access-cjvw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.567818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data" (OuterVolumeSpecName: "config-data") pod "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" (UID: "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.577769 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" (UID: "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.634555 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.634598 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.634614 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjvw6\" (UniqueName: \"kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.634627 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.826935 4795 generic.go:334] "Generic (PLEG): container finished" podID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerID="31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893" exitCode=0 Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.826975 4795 generic.go:334] "Generic (PLEG): container finished" podID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerID="989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806" exitCode=143 Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.826988 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.826984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerDied","Data":"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"} Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.827488 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerDied","Data":"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"} Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.827613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerDied","Data":"ab84cd82bd3f42e54f99be97caf2de2ed929e55f48d616683e9c8f126716e9cd"} Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.827538 4795 scope.go:117] "RemoveContainer" containerID="31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.847563 4795 scope.go:117] "RemoveContainer" containerID="989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.869456 4795 scope.go:117] "RemoveContainer" containerID="31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.880992 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:08 crc kubenswrapper[4795]: E0320 17:40:08.881298 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893\": container with ID starting with 31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893 
not found: ID does not exist" containerID="31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.881380 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"} err="failed to get container status \"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893\": rpc error: code = NotFound desc = could not find container \"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893\": container with ID starting with 31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893 not found: ID does not exist" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.881435 4795 scope.go:117] "RemoveContainer" containerID="989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806" Mar 20 17:40:08 crc kubenswrapper[4795]: E0320 17:40:08.882107 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806\": container with ID starting with 989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806 not found: ID does not exist" containerID="989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.882282 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"} err="failed to get container status \"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806\": rpc error: code = NotFound desc = could not find container \"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806\": container with ID starting with 989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806 not found: ID does not exist" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 
17:40:08.882485 4795 scope.go:117] "RemoveContainer" containerID="31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.894759 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.895271 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"} err="failed to get container status \"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893\": rpc error: code = NotFound desc = could not find container \"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893\": container with ID starting with 31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893 not found: ID does not exist" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.895484 4795 scope.go:117] "RemoveContainer" containerID="989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.896243 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"} err="failed to get container status \"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806\": rpc error: code = NotFound desc = could not find container \"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806\": container with ID starting with 989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806 not found: ID does not exist" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.907732 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:08 crc kubenswrapper[4795]: E0320 17:40:08.908285 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" 
containerName="nova-metadata-log" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.908395 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-log" Mar 20 17:40:08 crc kubenswrapper[4795]: E0320 17:40:08.908497 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e6fe9e-d22e-420c-b050-a00a53749f1f" containerName="oc" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.908578 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e6fe9e-d22e-420c-b050-a00a53749f1f" containerName="oc" Mar 20 17:40:08 crc kubenswrapper[4795]: E0320 17:40:08.908662 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-metadata" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.908755 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-metadata" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.909044 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e6fe9e-d22e-420c-b050-a00a53749f1f" containerName="oc" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.909130 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-log" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.909246 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-metadata" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.910542 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.915487 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.915734 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.917123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.049002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.049122 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.049211 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.049242 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data\") pod \"nova-metadata-0\" (UID: 
\"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.049278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2dkw\" (UniqueName: \"kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151344 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151391 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151431 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2dkw\" (UniqueName: \"kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151483 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 
17:40:09.151541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.155368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.161175 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.161249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.181048 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2dkw\" (UniqueName: \"kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw\") pod \"nova-metadata-0\" (UID: 
\"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.264347 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.273289 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" path="/var/lib/kubelet/pods/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950/volumes" Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.719078 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.839031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerStarted","Data":"c172225cd88a41d11bff28076c479195964bb7b3c9cd3656f88a39cdd6d2adba"} Mar 20 17:40:10 crc kubenswrapper[4795]: I0320 17:40:10.849395 4795 generic.go:334] "Generic (PLEG): container finished" podID="18794d5c-e43a-44dc-9510-763a31275104" containerID="2bb981684096c6d7989fb2cb73e5f71d3f241740ae3f49c070189810ce7e7bb1" exitCode=0 Mar 20 17:40:10 crc kubenswrapper[4795]: I0320 17:40:10.849476 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kmgk9" event={"ID":"18794d5c-e43a-44dc-9510-763a31275104","Type":"ContainerDied","Data":"2bb981684096c6d7989fb2cb73e5f71d3f241740ae3f49c070189810ce7e7bb1"} Mar 20 17:40:10 crc kubenswrapper[4795]: I0320 17:40:10.852531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerStarted","Data":"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93"} Mar 20 17:40:10 crc kubenswrapper[4795]: I0320 17:40:10.852574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerStarted","Data":"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968"} Mar 20 17:40:10 crc kubenswrapper[4795]: I0320 17:40:10.891126 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.891106126 podStartE2EDuration="2.891106126s" podCreationTimestamp="2026-03-20 17:40:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:10.889526796 +0000 UTC m=+1354.347558357" watchObservedRunningTime="2026-03-20 17:40:10.891106126 +0000 UTC m=+1354.349137667" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.251271 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.420155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle\") pod \"18794d5c-e43a-44dc-9510-763a31275104\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.420526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh9qq\" (UniqueName: \"kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq\") pod \"18794d5c-e43a-44dc-9510-763a31275104\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.420736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts\") pod \"18794d5c-e43a-44dc-9510-763a31275104\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " Mar 20 17:40:12 crc 
kubenswrapper[4795]: I0320 17:40:12.420779 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data\") pod \"18794d5c-e43a-44dc-9510-763a31275104\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.434535 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq" (OuterVolumeSpecName: "kube-api-access-fh9qq") pod "18794d5c-e43a-44dc-9510-763a31275104" (UID: "18794d5c-e43a-44dc-9510-763a31275104"). InnerVolumeSpecName "kube-api-access-fh9qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.434535 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts" (OuterVolumeSpecName: "scripts") pod "18794d5c-e43a-44dc-9510-763a31275104" (UID: "18794d5c-e43a-44dc-9510-763a31275104"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.472105 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data" (OuterVolumeSpecName: "config-data") pod "18794d5c-e43a-44dc-9510-763a31275104" (UID: "18794d5c-e43a-44dc-9510-763a31275104"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.474591 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18794d5c-e43a-44dc-9510-763a31275104" (UID: "18794d5c-e43a-44dc-9510-763a31275104"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.524010 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh9qq\" (UniqueName: \"kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.524062 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.524082 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.524102 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.881567 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kmgk9" event={"ID":"18794d5c-e43a-44dc-9510-763a31275104","Type":"ContainerDied","Data":"221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf"} Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.881639 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.881813 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.888750 4795 generic.go:334] "Generic (PLEG): container finished" podID="4f61db3a-a7de-495d-8305-b9e2910415e2" containerID="de6cb1e775438df313c59586663c35fb681b66b389159fc9df68bc69d850ac1c" exitCode=0 Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.888756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" event={"ID":"4f61db3a-a7de-495d-8305-b9e2910415e2","Type":"ContainerDied","Data":"de6cb1e775438df313c59586663c35fb681b66b389159fc9df68bc69d850ac1c"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.124850 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.125361 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-log" containerID="cri-o://753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" gracePeriod=30 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.125434 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-api" containerID="cri-o://50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" gracePeriod=30 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.140442 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.140677 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b2f35863-4f45-43d5-b600-9028b32195d7" containerName="nova-scheduler-scheduler" containerID="cri-o://87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517" gracePeriod=30 Mar 20 
17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.164304 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.164558 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-log" containerID="cri-o://a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" gracePeriod=30 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.164657 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-metadata" containerID="cri-o://b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" gracePeriod=30 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.405090 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.489804 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"] Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.697838 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.703814 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856263 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data\") pod \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856335 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs\") pod \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs\") pod \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856403 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle\") pod \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9vf2\" (UniqueName: \"kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2\") pod \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs\") pod \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856583 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data\") pod \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856642 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs" (OuterVolumeSpecName: "logs") pod "f0082b8a-cf10-4449-a93f-b0c79e10e2d0" (UID: "f0082b8a-cf10-4449-a93f-b0c79e10e2d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856666 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle\") pod \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2dkw\" (UniqueName: \"kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw\") pod \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856850 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs" (OuterVolumeSpecName: "logs") pod "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" (UID: "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.857151 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.857169 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.862078 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2" (OuterVolumeSpecName: "kube-api-access-j9vf2") pod "f0082b8a-cf10-4449-a93f-b0c79e10e2d0" (UID: "f0082b8a-cf10-4449-a93f-b0c79e10e2d0"). InnerVolumeSpecName "kube-api-access-j9vf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.866955 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw" (OuterVolumeSpecName: "kube-api-access-d2dkw") pod "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" (UID: "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8"). InnerVolumeSpecName "kube-api-access-d2dkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.888182 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data" (OuterVolumeSpecName: "config-data") pod "f0082b8a-cf10-4449-a93f-b0c79e10e2d0" (UID: "f0082b8a-cf10-4449-a93f-b0c79e10e2d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.894874 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0082b8a-cf10-4449-a93f-b0c79e10e2d0" (UID: "f0082b8a-cf10-4449-a93f-b0c79e10e2d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.901189 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data" (OuterVolumeSpecName: "config-data") pod "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" (UID: "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.903273 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" (UID: "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.919311 4795 generic.go:334] "Generic (PLEG): container finished" podID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerID="50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" exitCode=0 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.919336 4795 generic.go:334] "Generic (PLEG): container finished" podID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerID="753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" exitCode=143 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.919425 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.919824 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerDied","Data":"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.920023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerDied","Data":"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.920048 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerDied","Data":"c016e3406d811f115716e682934de46c5a980e71042e6b56bc4a4a96322456f5"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.920065 4795 scope.go:117] "RemoveContainer" containerID="50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921183 4795 generic.go:334] "Generic (PLEG): container finished" podID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" 
containerID="b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" exitCode=0 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921212 4795 generic.go:334] "Generic (PLEG): container finished" podID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerID="a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" exitCode=143 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921227 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerDied","Data":"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerDied","Data":"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921611 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerDied","Data":"c172225cd88a41d11bff28076c479195964bb7b3c9cd3656f88a39cdd6d2adba"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921698 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="dnsmasq-dns" containerID="cri-o://154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93" gracePeriod=10 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.926741 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" (UID: "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.949889 4795 scope.go:117] "RemoveContainer" containerID="753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.950081 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: connect: connection refused" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967599 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967637 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967650 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9vf2\" (UniqueName: \"kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967661 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967671 4795 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967681 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2dkw\" (UniqueName: \"kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967715 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.011061 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.027162 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.028062 4795 scope.go:117] "RemoveContainer" containerID="50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.030240 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49\": container with ID starting with 50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49 not found: ID does not exist" containerID="50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.030313 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49"} err="failed to get container status 
\"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49\": rpc error: code = NotFound desc = could not find container \"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49\": container with ID starting with 50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.030365 4795 scope.go:117] "RemoveContainer" containerID="753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.034462 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37\": container with ID starting with 753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37 not found: ID does not exist" containerID="753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.034514 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37"} err="failed to get container status \"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37\": rpc error: code = NotFound desc = could not find container \"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37\": container with ID starting with 753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.034548 4795 scope.go:117] "RemoveContainer" containerID="50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.035039 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49"} err="failed to get 
container status \"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49\": rpc error: code = NotFound desc = could not find container \"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49\": container with ID starting with 50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.035073 4795 scope.go:117] "RemoveContainer" containerID="753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.035480 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37"} err="failed to get container status \"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37\": rpc error: code = NotFound desc = could not find container \"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37\": container with ID starting with 753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.035503 4795 scope.go:117] "RemoveContainer" containerID="b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.038727 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-log" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038742 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-log" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.038763 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-log" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038771 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-log" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.038800 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-metadata" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038806 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-metadata" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.038824 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18794d5c-e43a-44dc-9510-763a31275104" containerName="nova-manage" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038829 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="18794d5c-e43a-44dc-9510-763a31275104" containerName="nova-manage" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.038837 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-api" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038842 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-api" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039012 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-metadata" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039026 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-log" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039034 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-log" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039043 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="18794d5c-e43a-44dc-9510-763a31275104" containerName="nova-manage" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039051 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-api" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039922 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.045576 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.050480 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.092529 4795 scope.go:117] "RemoveContainer" containerID="a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.121259 4795 scope.go:117] "RemoveContainer" containerID="b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.123538 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93\": container with ID starting with b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93 not found: ID does not exist" containerID="b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.123593 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93"} err="failed to get container status \"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93\": rpc error: code = NotFound desc = could not find container \"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93\": container with ID starting with b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.123620 4795 scope.go:117] "RemoveContainer" containerID="a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.124218 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968\": container with ID starting with a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968 not found: ID does not exist" containerID="a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.124254 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968"} err="failed to get container status \"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968\": rpc error: code = NotFound desc = could not find container \"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968\": container with ID starting with a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.124277 4795 scope.go:117] "RemoveContainer" containerID="b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.124643 4795 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93"} err="failed to get container status \"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93\": rpc error: code = NotFound desc = could not find container \"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93\": container with ID starting with b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.124679 4795 scope.go:117] "RemoveContainer" containerID="a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.124911 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968"} err="failed to get container status \"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968\": rpc error: code = NotFound desc = could not find container \"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968\": container with ID starting with a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.171997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.172042 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 
17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.172073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86gkp\" (UniqueName: \"kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.172114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.275626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.275667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.275715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86gkp\" (UniqueName: \"kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.275831 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.277759 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.282286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.283017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.292134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86gkp\" (UniqueName: \"kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.357560 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.357785 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.361735 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.378006 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.388952 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.389431 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f61db3a-a7de-495d-8305-b9e2910415e2" containerName="nova-cell1-conductor-db-sync" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.389444 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f61db3a-a7de-495d-8305-b9e2910415e2" containerName="nova-cell1-conductor-db-sync" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.389600 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f61db3a-a7de-495d-8305-b9e2910415e2" containerName="nova-cell1-conductor-db-sync" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395123 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts\") pod \"4f61db3a-a7de-495d-8305-b9e2910415e2\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395372 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395451 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395562 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.406564 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.409178 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.410815 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts" (OuterVolumeSpecName: "scripts") pod "4f61db3a-a7de-495d-8305-b9e2910415e2" (UID: "4f61db3a-a7de-495d-8305-b9e2910415e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.411033 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.411456 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496563 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle\") pod \"4f61db3a-a7de-495d-8305-b9e2910415e2\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data\") pod \"4f61db3a-a7de-495d-8305-b9e2910415e2\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496752 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvf65\" (UniqueName: \"kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65\") pod \"4f61db3a-a7de-495d-8305-b9e2910415e2\" (UID: 
\"4f61db3a-a7de-495d-8305-b9e2910415e2\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496913 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.497008 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.497047 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.497103 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.497974 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.500804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.512068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.515960 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65" (OuterVolumeSpecName: "kube-api-access-cvf65") pod "4f61db3a-a7de-495d-8305-b9e2910415e2" (UID: "4f61db3a-a7de-495d-8305-b9e2910415e2"). InnerVolumeSpecName "kube-api-access-cvf65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.516463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.522324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.543890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f61db3a-a7de-495d-8305-b9e2910415e2" (UID: "4f61db3a-a7de-495d-8305-b9e2910415e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.546198 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data" (OuterVolumeSpecName: "config-data") pod "4f61db3a-a7de-495d-8305-b9e2910415e2" (UID: "4f61db3a-a7de-495d-8305-b9e2910415e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.576465 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.586223 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.605698 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvf65\" (UniqueName: \"kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.605731 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.605742 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710140 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710169 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-92c6w\" (UniqueName: \"kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710229 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.720872 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w" (OuterVolumeSpecName: "kube-api-access-92c6w") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "kube-api-access-92c6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.756889 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.782215 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config" (OuterVolumeSpecName: "config") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.784615 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.796634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.801376 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812588 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812634 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812645 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812655 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92c6w\" (UniqueName: \"kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812665 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812675 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.818808 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.873893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.913543 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle\") pod \"b2f35863-4f45-43d5-b600-9028b32195d7\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.913715 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbkkt\" (UniqueName: \"kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt\") pod \"b2f35863-4f45-43d5-b600-9028b32195d7\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.913845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data\") pod \"b2f35863-4f45-43d5-b600-9028b32195d7\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.917878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt" (OuterVolumeSpecName: "kube-api-access-sbkkt") pod "b2f35863-4f45-43d5-b600-9028b32195d7" (UID: "b2f35863-4f45-43d5-b600-9028b32195d7"). InnerVolumeSpecName "kube-api-access-sbkkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.940455 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerStarted","Data":"40ebeaf246d0901fcc00ff42264c8849595abc6aa664bf61e7a95863c72633fd"} Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.941203 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2f35863-4f45-43d5-b600-9028b32195d7" (UID: "b2f35863-4f45-43d5-b600-9028b32195d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.941959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" event={"ID":"4f61db3a-a7de-495d-8305-b9e2910415e2","Type":"ContainerDied","Data":"be403fd93662530286fc5651363ea195fe87d753696088488bd47663f1933769"} Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.942001 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be403fd93662530286fc5651363ea195fe87d753696088488bd47663f1933769" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.942074 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.949076 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data" (OuterVolumeSpecName: "config-data") pod "b2f35863-4f45-43d5-b600-9028b32195d7" (UID: "b2f35863-4f45-43d5-b600-9028b32195d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.952550 4795 generic.go:334] "Generic (PLEG): container finished" podID="b2f35863-4f45-43d5-b600-9028b32195d7" containerID="87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517" exitCode=0 Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.952644 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.952613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2f35863-4f45-43d5-b600-9028b32195d7","Type":"ContainerDied","Data":"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517"} Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.952753 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2f35863-4f45-43d5-b600-9028b32195d7","Type":"ContainerDied","Data":"a81953c7eff097e8b8de2cddd252282ad6966a7afa286fafb4eb334123de90a3"} Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.952785 4795 scope.go:117] "RemoveContainer" containerID="87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.958210 4795 generic.go:334] "Generic (PLEG): container finished" podID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerID="154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93" exitCode=0 Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.958293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" event={"ID":"da0f84b3-294d-455f-89e7-1c8f8439a837","Type":"ContainerDied","Data":"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93"} Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.958318 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.958324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" event={"ID":"da0f84b3-294d-455f-89e7-1c8f8439a837","Type":"ContainerDied","Data":"d7ccdefaaa93b0e48b444bfb331ca4591ab4806568e7a9f1ee5df6eaa4ff29c6"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.000128 4795 scope.go:117] "RemoveContainer" containerID="87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517" Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.000575 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517\": container with ID starting with 87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517 not found: ID does not exist" containerID="87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.000609 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517"} err="failed to get container status \"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517\": rpc error: code = NotFound desc = could not find container \"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517\": container with ID starting with 87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517 not found: ID does not exist" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.000635 4795 scope.go:117] "RemoveContainer" containerID="154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.016199 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.016224 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbkkt\" (UniqueName: \"kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.016233 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.016779 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.017269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="dnsmasq-dns" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.017290 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="dnsmasq-dns" Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.017317 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="init" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.017327 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="init" Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.017353 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f35863-4f45-43d5-b600-9028b32195d7" containerName="nova-scheduler-scheduler" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.017363 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f35863-4f45-43d5-b600-9028b32195d7" containerName="nova-scheduler-scheduler" Mar 20 
17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.017592 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="dnsmasq-dns" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.017618 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2f35863-4f45-43d5-b600-9028b32195d7" containerName="nova-scheduler-scheduler" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.018375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.037754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.039017 4795 scope.go:117] "RemoveContainer" containerID="7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.039694 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.050400 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.068637 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.101431 4795 scope.go:117] "RemoveContainer" containerID="154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93" Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.102343 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93\": container with ID starting with 154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93 not found: ID does not exist" 
containerID="154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.102373 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93"} err="failed to get container status \"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93\": rpc error: code = NotFound desc = could not find container \"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93\": container with ID starting with 154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93 not found: ID does not exist" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.102398 4795 scope.go:117] "RemoveContainer" containerID="7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a" Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.103177 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a\": container with ID starting with 7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a not found: ID does not exist" containerID="7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.103202 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a"} err="failed to get container status \"7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a\": rpc error: code = NotFound desc = could not find container \"7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a\": container with ID starting with 7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a not found: ID does not exist" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.106751 4795 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.119793 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.119846 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28wdn\" (UniqueName: \"kubernetes.io/projected/19c15c93-572c-4d53-b924-172f3ad29c8a-kube-api-access-28wdn\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.119982 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.130762 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.131947 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.135085 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.142308 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.163747 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.177934 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28wdn\" (UniqueName: \"kubernetes.io/projected/19c15c93-572c-4d53-b924-172f3ad29c8a-kube-api-access-28wdn\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgnnx\" (UniqueName: \"kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221663 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221696 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221806 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.225332 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.228325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.241257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28wdn\" (UniqueName: 
\"kubernetes.io/projected/19c15c93-572c-4d53-b924-172f3ad29c8a-kube-api-access-28wdn\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.264677 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" path="/var/lib/kubelet/pods/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8/volumes" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.266202 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2f35863-4f45-43d5-b600-9028b32195d7" path="/var/lib/kubelet/pods/b2f35863-4f45-43d5-b600-9028b32195d7/volumes" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.266947 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" path="/var/lib/kubelet/pods/da0f84b3-294d-455f-89e7-1c8f8439a837/volumes" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.267663 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" path="/var/lib/kubelet/pods/f0082b8a-cf10-4449-a93f-b0c79e10e2d0/volumes" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.324096 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.324276 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.324364 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgnnx\" (UniqueName: \"kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.330130 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.330518 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.341232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgnnx\" (UniqueName: \"kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.367856 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.468659 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.793281 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.931186 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: W0320 17:40:15.933050 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef24c878_faa8_4a0b_a303_951d0a457eef.slice/crio-84bfddf694ecbc3a7df8e31cc881d1692c1599f3d1dc0bd61b96f49386da6d0c WatchSource:0}: Error finding container 84bfddf694ecbc3a7df8e31cc881d1692c1599f3d1dc0bd61b96f49386da6d0c: Status 404 returned error can't find the container with id 84bfddf694ecbc3a7df8e31cc881d1692c1599f3d1dc0bd61b96f49386da6d0c Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.979469 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerStarted","Data":"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.979508 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerStarted","Data":"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.979518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerStarted","Data":"468acb8849abdd77a144584691b96b8cbadeae923d66dd538f230c0aee8d52cb"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.982257 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"19c15c93-572c-4d53-b924-172f3ad29c8a","Type":"ContainerStarted","Data":"2d7a81c2145eeccbda9da96a1c6112a931d61cce32b35c2cb50ce493055e207b"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.984699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef24c878-faa8-4a0b-a303-951d0a457eef","Type":"ContainerStarted","Data":"84bfddf694ecbc3a7df8e31cc881d1692c1599f3d1dc0bd61b96f49386da6d0c"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.987113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerStarted","Data":"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.987138 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerStarted","Data":"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.994907 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.994889277 podStartE2EDuration="1.994889277s" podCreationTimestamp="2026-03-20 17:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:15.992964725 +0000 UTC m=+1359.450996286" watchObservedRunningTime="2026-03-20 17:40:15.994889277 +0000 UTC m=+1359.452920828" Mar 20 17:40:16 crc kubenswrapper[4795]: I0320 17:40:16.022510 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.022491244 podStartE2EDuration="2.022491244s" podCreationTimestamp="2026-03-20 17:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 17:40:16.010721094 +0000 UTC m=+1359.468752655" watchObservedRunningTime="2026-03-20 17:40:16.022491244 +0000 UTC m=+1359.480522805" Mar 20 17:40:17 crc kubenswrapper[4795]: I0320 17:40:17.000966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"19c15c93-572c-4d53-b924-172f3ad29c8a","Type":"ContainerStarted","Data":"50aadf2db6761032903224e02e2fdd62cf9b757415bac79531147b03ed93db54"} Mar 20 17:40:17 crc kubenswrapper[4795]: I0320 17:40:17.001189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:17 crc kubenswrapper[4795]: I0320 17:40:17.003217 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef24c878-faa8-4a0b-a303-951d0a457eef","Type":"ContainerStarted","Data":"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463"} Mar 20 17:40:17 crc kubenswrapper[4795]: I0320 17:40:17.028170 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.02815522 podStartE2EDuration="3.02815522s" podCreationTimestamp="2026-03-20 17:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:17.026121937 +0000 UTC m=+1360.484153508" watchObservedRunningTime="2026-03-20 17:40:17.02815522 +0000 UTC m=+1360.486186761" Mar 20 17:40:17 crc kubenswrapper[4795]: I0320 17:40:17.049410 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.049394538 podStartE2EDuration="2.049394538s" podCreationTimestamp="2026-03-20 17:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:17.046894722 +0000 UTC m=+1360.504926263" 
watchObservedRunningTime="2026-03-20 17:40:17.049394538 +0000 UTC m=+1360.507426079" Mar 20 17:40:20 crc kubenswrapper[4795]: I0320 17:40:20.470133 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 17:40:20 crc kubenswrapper[4795]: I0320 17:40:20.963161 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.359223 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.359554 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.496845 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.497127 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" containerName="kube-state-metrics" containerID="cri-o://19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802" gracePeriod=30 Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.586560 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.586871 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.035823 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.105779 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" containerID="19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802" exitCode=2 Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.105818 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8","Type":"ContainerDied","Data":"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802"} Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.105845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8","Type":"ContainerDied","Data":"715534f72ece852c083764840657cce952ec7708ddcedcd00af2caddc251418f"} Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.105861 4795 scope.go:117] "RemoveContainer" containerID="19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.106013 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.137371 4795 scope.go:117] "RemoveContainer" containerID="19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802" Mar 20 17:40:25 crc kubenswrapper[4795]: E0320 17:40:25.137852 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802\": container with ID starting with 19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802 not found: ID does not exist" containerID="19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.137883 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802"} err="failed to get container status \"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802\": rpc error: code = NotFound desc = could not find container \"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802\": container with ID starting with 19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802 not found: ID does not exist" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.224442 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k6xp\" (UniqueName: \"kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp\") pod \"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8\" (UID: \"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8\") " Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.252969 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp" (OuterVolumeSpecName: "kube-api-access-5k6xp") pod "5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" (UID: 
"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8"). InnerVolumeSpecName "kube-api-access-5k6xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.327436 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k6xp\" (UniqueName: \"kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.396242 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.400283 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.441664 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.469868 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.470492 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.479458 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.506239 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:25 crc kubenswrapper[4795]: 
E0320 17:40:25.506613 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" containerName="kube-state-metrics" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.506624 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" containerName="kube-state-metrics" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.506787 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" containerName="kube-state-metrics" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.507324 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.511824 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.511997 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.513482 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.529729 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.596864 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.596899 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.632000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlwkj\" (UniqueName: \"kubernetes.io/projected/72605c7d-99df-450f-900b-3022b0520149-kube-api-access-jlwkj\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.632044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.632117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.632154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.733503 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlwkj\" 
(UniqueName: \"kubernetes.io/projected/72605c7d-99df-450f-900b-3022b0520149-kube-api-access-jlwkj\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.733558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.733626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.734343 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.747856 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.747920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.748129 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.753548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlwkj\" (UniqueName: \"kubernetes.io/projected/72605c7d-99df-450f-900b-3022b0520149-kube-api-access-jlwkj\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.824607 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.169302 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.294183 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:26 crc kubenswrapper[4795]: W0320 17:40:26.294911 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72605c7d_99df_450f_900b_3022b0520149.slice/crio-6533b348b3ad7d46fa777cba6c7d28ba981a053a4077c8c602919ce79344e2d8 WatchSource:0}: Error finding container 6533b348b3ad7d46fa777cba6c7d28ba981a053a4077c8c602919ce79344e2d8: Status 404 returned error can't find the container with id 6533b348b3ad7d46fa777cba6c7d28ba981a053a4077c8c602919ce79344e2d8 Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.478038 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.478306 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-central-agent" containerID="cri-o://ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1" gracePeriod=30 Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.478365 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="proxy-httpd" containerID="cri-o://1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f" gracePeriod=30 Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.478411 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb437f62-80bf-465d-85cf-12348aba1514" 
containerName="sg-core" containerID="cri-o://a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc" gracePeriod=30 Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.478411 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-notification-agent" containerID="cri-o://f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c" gracePeriod=30 Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149459 4795 generic.go:334] "Generic (PLEG): container finished" podID="eb437f62-80bf-465d-85cf-12348aba1514" containerID="1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f" exitCode=0 Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149769 4795 generic.go:334] "Generic (PLEG): container finished" podID="eb437f62-80bf-465d-85cf-12348aba1514" containerID="a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc" exitCode=2 Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149784 4795 generic.go:334] "Generic (PLEG): container finished" podID="eb437f62-80bf-465d-85cf-12348aba1514" containerID="ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1" exitCode=0 Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149524 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerDied","Data":"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f"} Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerDied","Data":"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc"} Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerDied","Data":"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1"} Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.151628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"72605c7d-99df-450f-900b-3022b0520149","Type":"ContainerStarted","Data":"ded6295ee3fa941441f90f6f2bd86fc156736a39b08b900c51df152156208d21"} Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.151675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"72605c7d-99df-450f-900b-3022b0520149","Type":"ContainerStarted","Data":"6533b348b3ad7d46fa777cba6c7d28ba981a053a4077c8c602919ce79344e2d8"} Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.171039 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.808220489 podStartE2EDuration="2.171018731s" podCreationTimestamp="2026-03-20 17:40:25 +0000 UTC" firstStartedPulling="2026-03-20 17:40:26.297084279 +0000 UTC m=+1369.755115820" lastFinishedPulling="2026-03-20 17:40:26.659882521 +0000 UTC m=+1370.117914062" observedRunningTime="2026-03-20 17:40:27.166022449 +0000 UTC m=+1370.624053990" watchObservedRunningTime="2026-03-20 17:40:27.171018731 +0000 UTC m=+1370.629050282" Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.269342 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" path="/var/lib/kubelet/pods/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8/volumes" Mar 20 17:40:28 crc kubenswrapper[4795]: I0320 17:40:28.160741 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.090111 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161411 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161469 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161552 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bll6z\" (UniqueName: \"kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161775 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161807 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.162885 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.162419 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.170953 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts" (OuterVolumeSpecName: "scripts") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.170989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z" (OuterVolumeSpecName: "kube-api-access-bll6z") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "kube-api-access-bll6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.201960 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.208668 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.208715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerDied","Data":"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c"} Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.208681 4795 generic.go:334] "Generic (PLEG): container finished" podID="eb437f62-80bf-465d-85cf-12348aba1514" containerID="f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c" exitCode=0 Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.208833 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerDied","Data":"39c11fccd9e673059022bf047af401ca209830155fcf251e8d72aeeb8fa6e0d2"} Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.208876 4795 scope.go:117] "RemoveContainer" containerID="1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.242094 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.263556 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.264447 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.265308 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.265386 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.265455 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.265510 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bll6z\" (UniqueName: \"kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.271405 4795 scope.go:117] "RemoveContainer" containerID="a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.280235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data" (OuterVolumeSpecName: "config-data") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.291259 4795 scope.go:117] "RemoveContainer" containerID="f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.311065 4795 scope.go:117] "RemoveContainer" containerID="ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.328536 4795 scope.go:117] "RemoveContainer" containerID="1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.328936 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f\": container with ID starting with 1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f not found: ID does not exist" containerID="1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.328980 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f"} err="failed to get container status \"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f\": rpc error: code = NotFound desc = could not find container \"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f\": container with ID starting with 1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f not found: ID does not exist" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329005 4795 scope.go:117] "RemoveContainer" 
containerID="a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.329337 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc\": container with ID starting with a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc not found: ID does not exist" containerID="a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329367 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc"} err="failed to get container status \"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc\": rpc error: code = NotFound desc = could not find container \"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc\": container with ID starting with a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc not found: ID does not exist" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329388 4795 scope.go:117] "RemoveContainer" containerID="f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.329640 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c\": container with ID starting with f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c not found: ID does not exist" containerID="f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329664 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c"} err="failed to get container status \"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c\": rpc error: code = NotFound desc = could not find container \"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c\": container with ID starting with f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c not found: ID does not exist" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329680 4795 scope.go:117] "RemoveContainer" containerID="ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.329901 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1\": container with ID starting with ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1 not found: ID does not exist" containerID="ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329924 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1"} err="failed to get container status \"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1\": rpc error: code = NotFound desc = could not find container \"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1\": container with ID starting with ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1 not found: ID does not exist" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.358639 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.358704 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.367321 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.547945 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.557580 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.585723 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.586227 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-notification-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586243 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-notification-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.586275 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="sg-core" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586286 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="sg-core" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.586307 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-central-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586316 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-central-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 
17:40:32.586329 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="proxy-httpd" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586337 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="proxy-httpd" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586548 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-notification-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586568 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="sg-core" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586579 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="proxy-httpd" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586600 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-central-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.588970 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.590292 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.591365 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.593101 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.640826 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.641031 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.641029 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.672815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.672865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.672951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.673065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.673413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.673484 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.673521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvv62\" (UniqueName: \"kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.673902 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.775878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvv62\" (UniqueName: \"kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.775946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776139 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776197 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776792 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.781392 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 
17:40:32.781408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.781768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.782148 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.783310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.799917 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvv62\" (UniqueName: \"kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.944513 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:33 crc kubenswrapper[4795]: I0320 17:40:33.262335 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb437f62-80bf-465d-85cf-12348aba1514" path="/var/lib/kubelet/pods/eb437f62-80bf-465d-85cf-12348aba1514/volumes" Mar 20 17:40:33 crc kubenswrapper[4795]: I0320 17:40:33.402454 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:33 crc kubenswrapper[4795]: W0320 17:40:33.409665 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode15f36e1_3fd7_43bc_9aaa_d793c6a43fd0.slice/crio-a5462b7d9ee6668ab2f74800a7d8242e55e688c0522640f520f27564c8696ebe WatchSource:0}: Error finding container a5462b7d9ee6668ab2f74800a7d8242e55e688c0522640f520f27564c8696ebe: Status 404 returned error can't find the container with id a5462b7d9ee6668ab2f74800a7d8242e55e688c0522640f520f27564c8696ebe Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.232335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerStarted","Data":"a5462b7d9ee6668ab2f74800a7d8242e55e688c0522640f520f27564c8696ebe"} Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.365012 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.365826 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.375779 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.591491 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 
17:40:34.600469 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.606378 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.263970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.264243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerStarted","Data":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.264264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerStarted","Data":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.273301 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.478964 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.488493 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.499495 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528559 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528709 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528748 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dmn5j\" (UniqueName: \"kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632338 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmn5j\" (UniqueName: 
\"kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.633584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.634339 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.635203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.636019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.637346 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.660039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmn5j\" (UniqueName: \"kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.820155 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.843964 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 17:40:36 crc kubenswrapper[4795]: I0320 17:40:36.267899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerStarted","Data":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} Mar 20 17:40:36 crc kubenswrapper[4795]: I0320 17:40:36.380278 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:40:37 crc kubenswrapper[4795]: I0320 17:40:37.308702 4795 generic.go:334] "Generic (PLEG): container finished" podID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerID="b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087" exitCode=0 Mar 20 17:40:37 crc kubenswrapper[4795]: I0320 17:40:37.308906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" event={"ID":"35b8efb0-212f-4ee0-bb05-4655aff260b5","Type":"ContainerDied","Data":"b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087"} Mar 20 17:40:37 crc kubenswrapper[4795]: I0320 17:40:37.309847 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" event={"ID":"35b8efb0-212f-4ee0-bb05-4655aff260b5","Type":"ContainerStarted","Data":"54687987edb9f2765e8d4f7b8bfef3664f1024d2c67848e40765c69ff1c22cea"} Mar 20 17:40:37 crc kubenswrapper[4795]: I0320 17:40:37.608835 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:37 crc kubenswrapper[4795]: I0320 17:40:37.901013 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.328455 4795 generic.go:334] "Generic (PLEG): container 
finished" podID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" containerID="65f55a27fa8e508e44d5a8d1bb44f105a57a43fd7b8f29f73e4c9d5944daa0d7" exitCode=137 Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.328494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74d437e5-b643-4a6f-a9d9-50cf8166d0af","Type":"ContainerDied","Data":"65f55a27fa8e508e44d5a8d1bb44f105a57a43fd7b8f29f73e4c9d5944daa0d7"} Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.330332 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" event={"ID":"35b8efb0-212f-4ee0-bb05-4655aff260b5","Type":"ContainerStarted","Data":"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8"} Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.330507 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-log" containerID="cri-o://9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843" gracePeriod=30 Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.330579 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-api" containerID="cri-o://0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5" gracePeriod=30 Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.364621 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" podStartSLOduration=3.36459957 podStartE2EDuration="3.36459957s" podCreationTimestamp="2026-03-20 17:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:38.360737023 +0000 UTC m=+1381.818768584" watchObservedRunningTime="2026-03-20 17:40:38.36459957 +0000 UTC 
m=+1381.822631111" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.386114 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.586173 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data\") pod \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.586480 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbfmr\" (UniqueName: \"kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr\") pod \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.586683 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle\") pod \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.591427 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr" (OuterVolumeSpecName: "kube-api-access-xbfmr") pod "74d437e5-b643-4a6f-a9d9-50cf8166d0af" (UID: "74d437e5-b643-4a6f-a9d9-50cf8166d0af"). InnerVolumeSpecName "kube-api-access-xbfmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.629616 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data" (OuterVolumeSpecName: "config-data") pod "74d437e5-b643-4a6f-a9d9-50cf8166d0af" (UID: "74d437e5-b643-4a6f-a9d9-50cf8166d0af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.648748 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74d437e5-b643-4a6f-a9d9-50cf8166d0af" (UID: "74d437e5-b643-4a6f-a9d9-50cf8166d0af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.688910 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.688945 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbfmr\" (UniqueName: \"kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.688955 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.339460 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff123956-68b6-4a60-ac22-1972b9554205" containerID="9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843" 
exitCode=143 Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.339525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerDied","Data":"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843"} Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.341101 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.341145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74d437e5-b643-4a6f-a9d9-50cf8166d0af","Type":"ContainerDied","Data":"a6f7fa5a8cd4bb19f7f275862231fd5657f29dc6b3da296fa035e77f5df8d7d5"} Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.341187 4795 scope.go:117] "RemoveContainer" containerID="65f55a27fa8e508e44d5a8d1bb44f105a57a43fd7b8f29f73e4c9d5944daa0d7" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.341417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.367479 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.390304 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.406569 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:39 crc kubenswrapper[4795]: E0320 17:40:39.407119 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.407144 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.407498 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.408523 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.414334 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.414428 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.414689 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.417095 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.606955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.607281 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" 
Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.607406 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.607584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.607683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qhs\" (UniqueName: \"kubernetes.io/projected/d2a5e398-6d25-43b1-8c29-407af2d9348b-kube-api-access-g5qhs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.708470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.708513 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5qhs\" (UniqueName: \"kubernetes.io/projected/d2a5e398-6d25-43b1-8c29-407af2d9348b-kube-api-access-g5qhs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.708550 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.708589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.708639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.713579 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.713589 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.713742 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.718147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.729985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5qhs\" (UniqueName: \"kubernetes.io/projected/d2a5e398-6d25-43b1-8c29-407af2d9348b-kube-api-access-g5qhs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.731670 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:40 crc kubenswrapper[4795]: I0320 17:40:40.211515 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:40 crc kubenswrapper[4795]: I0320 17:40:40.357601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d2a5e398-6d25-43b1-8c29-407af2d9348b","Type":"ContainerStarted","Data":"95807406a254da5fe2ae43dc10f1b3b1a4186d1ac881af24446bccfb5750f60d"} Mar 20 17:40:41 crc kubenswrapper[4795]: I0320 17:40:41.265184 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" path="/var/lib/kubelet/pods/74d437e5-b643-4a6f-a9d9-50cf8166d0af/volumes" Mar 20 17:40:41 crc kubenswrapper[4795]: I0320 17:40:41.427105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d2a5e398-6d25-43b1-8c29-407af2d9348b","Type":"ContainerStarted","Data":"1360ebb15bfb90c2fbb3a893d08d00218917f150a6ae9f359d41fcd3ab50f8b6"} Mar 20 17:40:41 crc kubenswrapper[4795]: I0320 17:40:41.473100 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.473087252 podStartE2EDuration="2.473087252s" podCreationTimestamp="2026-03-20 17:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:41.468083699 +0000 UTC m=+1384.926115230" watchObservedRunningTime="2026-03-20 17:40:41.473087252 +0000 UTC m=+1384.931118793" Mar 20 17:40:41 crc kubenswrapper[4795]: E0320 17:40:41.745995 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff123956_68b6_4a60_ac22_1972b9554205.slice/crio-0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff123956_68b6_4a60_ac22_1972b9554205.slice/crio-conmon-0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.014553 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.160645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data\") pod \"ff123956-68b6-4a60-ac22-1972b9554205\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.160734 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle\") pod \"ff123956-68b6-4a60-ac22-1972b9554205\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.160804 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86gkp\" (UniqueName: \"kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp\") pod \"ff123956-68b6-4a60-ac22-1972b9554205\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.161034 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs\") pod 
\"ff123956-68b6-4a60-ac22-1972b9554205\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.161878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs" (OuterVolumeSpecName: "logs") pod "ff123956-68b6-4a60-ac22-1972b9554205" (UID: "ff123956-68b6-4a60-ac22-1972b9554205"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.167235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp" (OuterVolumeSpecName: "kube-api-access-86gkp") pod "ff123956-68b6-4a60-ac22-1972b9554205" (UID: "ff123956-68b6-4a60-ac22-1972b9554205"). InnerVolumeSpecName "kube-api-access-86gkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.195792 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data" (OuterVolumeSpecName: "config-data") pod "ff123956-68b6-4a60-ac22-1972b9554205" (UID: "ff123956-68b6-4a60-ac22-1972b9554205"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.210528 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff123956-68b6-4a60-ac22-1972b9554205" (UID: "ff123956-68b6-4a60-ac22-1972b9554205"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.263121 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.263144 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.263155 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86gkp\" (UniqueName: \"kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.263164 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.437773 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff123956-68b6-4a60-ac22-1972b9554205" containerID="0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5" exitCode=0 Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.437855 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.437867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerDied","Data":"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5"} Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.438224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerDied","Data":"40ebeaf246d0901fcc00ff42264c8849595abc6aa664bf61e7a95863c72633fd"} Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.438242 4795 scope.go:117] "RemoveContainer" containerID="0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.440977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerStarted","Data":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.441144 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-central-agent" containerID="cri-o://c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" gracePeriod=30 Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.441149 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="proxy-httpd" containerID="cri-o://02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" gracePeriod=30 Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.441210 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="sg-core" containerID="cri-o://fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" gracePeriod=30 Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.441244 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-notification-agent" containerID="cri-o://becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" gracePeriod=30 Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.467584 4795 scope.go:117] "RemoveContainer" containerID="9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.483999 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.262729271 podStartE2EDuration="10.483977345s" podCreationTimestamp="2026-03-20 17:40:32 +0000 UTC" firstStartedPulling="2026-03-20 17:40:33.412033367 +0000 UTC m=+1376.870064908" lastFinishedPulling="2026-03-20 17:40:41.633281441 +0000 UTC m=+1385.091312982" observedRunningTime="2026-03-20 17:40:42.464767168 +0000 UTC m=+1385.922798709" watchObservedRunningTime="2026-03-20 17:40:42.483977345 +0000 UTC m=+1385.942008886" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.487433 4795 scope.go:117] "RemoveContainer" containerID="0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5" Mar 20 17:40:42 crc kubenswrapper[4795]: E0320 17:40:42.487885 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5\": container with ID starting with 0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5 not found: ID does not exist" containerID="0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5" Mar 20 17:40:42 crc 
kubenswrapper[4795]: I0320 17:40:42.487924 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5"} err="failed to get container status \"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5\": rpc error: code = NotFound desc = could not find container \"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5\": container with ID starting with 0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5 not found: ID does not exist" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.487951 4795 scope.go:117] "RemoveContainer" containerID="9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843" Mar 20 17:40:42 crc kubenswrapper[4795]: E0320 17:40:42.488235 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843\": container with ID starting with 9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843 not found: ID does not exist" containerID="9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.488284 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843"} err="failed to get container status \"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843\": rpc error: code = NotFound desc = could not find container \"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843\": container with ID starting with 9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843 not found: ID does not exist" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.495491 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:42 crc kubenswrapper[4795]: 
I0320 17:40:42.503848 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.511008 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:42 crc kubenswrapper[4795]: E0320 17:40:42.511481 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-api" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.511504 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-api" Mar 20 17:40:42 crc kubenswrapper[4795]: E0320 17:40:42.511526 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-log" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.511533 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-log" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.511803 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-api" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.511825 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-log" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.512878 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.516202 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.517199 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.517939 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.518297 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670170 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670250 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670294 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhkc8\" (UniqueName: \"kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670404 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772579 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 
17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhkc8\" (UniqueName: \"kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772705 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.773182 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.778057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.778585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.779938 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.780112 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.794137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhkc8\" (UniqueName: \"kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.827230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.108166 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.263677 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff123956-68b6-4a60-ac22-1972b9554205" path="/var/lib/kubelet/pods/ff123956-68b6-4a60-ac22-1972b9554205/volumes" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.281953 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvv62\" (UniqueName: \"kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs\") 
pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282244 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282263 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282312 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282992 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.283068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.287446 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62" (OuterVolumeSpecName: "kube-api-access-pvv62") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "kube-api-access-pvv62". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.288819 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts" (OuterVolumeSpecName: "scripts") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.311364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.350916 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:43 crc kubenswrapper[4795]: W0320 17:40:43.357898 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7f1fae_ee02_4e5c_a06c_9cfacbdc5207.slice/crio-587df48f97c3203a94e266d052630fadcdcbe04e7e5ddd68a72a13d942007e47 WatchSource:0}: Error finding container 587df48f97c3203a94e266d052630fadcdcbe04e7e5ddd68a72a13d942007e47: Status 404 returned error can't find the container with id 587df48f97c3203a94e266d052630fadcdcbe04e7e5ddd68a72a13d942007e47 Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.360474 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.381752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385007 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvv62\" (UniqueName: \"kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385046 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385797 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385810 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385819 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385827 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385844 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.398158 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data" (OuterVolumeSpecName: "config-data") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461093 4795 generic.go:334] "Generic (PLEG): container finished" podID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" exitCode=0 Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461131 4795 generic.go:334] "Generic (PLEG): container finished" podID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" exitCode=2 Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461141 4795 generic.go:334] "Generic (PLEG): container finished" podID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" exitCode=0 Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461153 4795 generic.go:334] "Generic (PLEG): container finished" podID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" exitCode=0 Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461157 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461205 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerDied","Data":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerDied","Data":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461273 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerDied","Data":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerDied","Data":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461293 4795 scope.go:117] "RemoveContainer" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerDied","Data":"a5462b7d9ee6668ab2f74800a7d8242e55e688c0522640f520f27564c8696ebe"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.462374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerStarted","Data":"587df48f97c3203a94e266d052630fadcdcbe04e7e5ddd68a72a13d942007e47"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.491503 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.504739 4795 scope.go:117] "RemoveContainer" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.521475 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.533387 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.542877 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.543255 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="sg-core" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543271 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="sg-core" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.543297 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-notification-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543304 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-notification-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.543319 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="proxy-httpd" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543325 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="proxy-httpd" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.543337 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-central-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543343 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-central-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543511 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-notification-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543528 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="sg-core" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543544 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-central-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543553 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="proxy-httpd" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.545355 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.546536 4795 scope.go:117] "RemoveContainer" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.548345 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.548516 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.548651 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.565058 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.596454 4795 scope.go:117] "RemoveContainer" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.621593 4795 scope.go:117] "RemoveContainer" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.622152 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": container with ID starting with 02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889 not found: ID does not exist" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.622184 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} err="failed to get container status 
\"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": rpc error: code = NotFound desc = could not find container \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": container with ID starting with 02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.622206 4795 scope.go:117] "RemoveContainer" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.622526 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": container with ID starting with fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93 not found: ID does not exist" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.622578 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} err="failed to get container status \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": rpc error: code = NotFound desc = could not find container \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": container with ID starting with fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.622613 4795 scope.go:117] "RemoveContainer" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.623057 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": container with ID starting with becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7 not found: ID does not exist" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623085 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} err="failed to get container status \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": rpc error: code = NotFound desc = could not find container \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": container with ID starting with becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623102 4795 scope.go:117] "RemoveContainer" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.623337 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": container with ID starting with c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8 not found: ID does not exist" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623358 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} err="failed to get container status \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": rpc error: code = NotFound desc = could not find container \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": container with ID 
starting with c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623370 4795 scope.go:117] "RemoveContainer" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623555 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} err="failed to get container status \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": rpc error: code = NotFound desc = could not find container \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": container with ID starting with 02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623576 4795 scope.go:117] "RemoveContainer" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623798 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} err="failed to get container status \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": rpc error: code = NotFound desc = could not find container \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": container with ID starting with fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623818 4795 scope.go:117] "RemoveContainer" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624018 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} err="failed to get container status \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": rpc error: code = NotFound desc = could not find container \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": container with ID starting with becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624034 4795 scope.go:117] "RemoveContainer" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624199 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} err="failed to get container status \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": rpc error: code = NotFound desc = could not find container \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": container with ID starting with c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624221 4795 scope.go:117] "RemoveContainer" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624431 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} err="failed to get container status \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": rpc error: code = NotFound desc = could not find container \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": container with ID starting with 02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889 not found: ID does not 
exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624477 4795 scope.go:117] "RemoveContainer" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624752 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} err="failed to get container status \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": rpc error: code = NotFound desc = could not find container \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": container with ID starting with fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624775 4795 scope.go:117] "RemoveContainer" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624942 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} err="failed to get container status \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": rpc error: code = NotFound desc = could not find container \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": container with ID starting with becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624964 4795 scope.go:117] "RemoveContainer" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625157 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} err="failed to get container status 
\"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": rpc error: code = NotFound desc = could not find container \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": container with ID starting with c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625175 4795 scope.go:117] "RemoveContainer" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625361 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} err="failed to get container status \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": rpc error: code = NotFound desc = could not find container \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": container with ID starting with 02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625378 4795 scope.go:117] "RemoveContainer" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625543 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} err="failed to get container status \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": rpc error: code = NotFound desc = could not find container \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": container with ID starting with fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625559 4795 scope.go:117] "RemoveContainer" 
containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625720 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} err="failed to get container status \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": rpc error: code = NotFound desc = could not find container \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": container with ID starting with becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625738 4795 scope.go:117] "RemoveContainer" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.626295 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} err="failed to get container status \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": rpc error: code = NotFound desc = could not find container \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": container with ID starting with c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.694826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v8sl\" (UniqueName: \"kubernetes.io/projected/81c4fa02-a2cf-4349-afe3-292e38b50e33-kube-api-access-5v8sl\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.694889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.694930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-log-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.694951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.694997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-config-data\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.695033 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-run-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.695081 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.695202 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-scripts\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-run-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-scripts\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v8sl\" (UniqueName: \"kubernetes.io/projected/81c4fa02-a2cf-4349-afe3-292e38b50e33-kube-api-access-5v8sl\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796815 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796845 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-log-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.797605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.797398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-run-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.797539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-log-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.797828 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-config-data\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc 
kubenswrapper[4795]: I0320 17:40:43.804124 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.807165 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.807225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.812386 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-scripts\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.813872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-config-data\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.820733 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v8sl\" (UniqueName: \"kubernetes.io/projected/81c4fa02-a2cf-4349-afe3-292e38b50e33-kube-api-access-5v8sl\") pod \"ceilometer-0\" 
(UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.875327 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:44 crc kubenswrapper[4795]: W0320 17:40:44.372185 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81c4fa02_a2cf_4349_afe3_292e38b50e33.slice/crio-aaf8bb02d9338d0c6b8f9c03a6a32821c74dddabf800ebc89f130aa76508ae1b WatchSource:0}: Error finding container aaf8bb02d9338d0c6b8f9c03a6a32821c74dddabf800ebc89f130aa76508ae1b: Status 404 returned error can't find the container with id aaf8bb02d9338d0c6b8f9c03a6a32821c74dddabf800ebc89f130aa76508ae1b Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.380210 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.474778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c4fa02-a2cf-4349-afe3-292e38b50e33","Type":"ContainerStarted","Data":"aaf8bb02d9338d0c6b8f9c03a6a32821c74dddabf800ebc89f130aa76508ae1b"} Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.479446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerStarted","Data":"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e"} Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.479475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerStarted","Data":"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb"} Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.518848 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.518828558 podStartE2EDuration="2.518828558s" podCreationTimestamp="2026-03-20 17:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:44.501941123 +0000 UTC m=+1387.959972694" watchObservedRunningTime="2026-03-20 17:40:44.518828558 +0000 UTC m=+1387.976860109" Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.732400 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:45 crc kubenswrapper[4795]: I0320 17:40:45.265255 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" path="/var/lib/kubelet/pods/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0/volumes" Mar 20 17:40:45 crc kubenswrapper[4795]: I0320 17:40:45.490874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c4fa02-a2cf-4349-afe3-292e38b50e33","Type":"ContainerStarted","Data":"256d2365325acd1f314b08010c8051a4b1f410b7e0de4b9f549eb9f57929213f"} Mar 20 17:40:45 crc kubenswrapper[4795]: I0320 17:40:45.823240 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:45 crc kubenswrapper[4795]: I0320 17:40:45.904866 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"] Mar 20 17:40:45 crc kubenswrapper[4795]: I0320 17:40:45.905195 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="dnsmasq-dns" containerID="cri-o://fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25" gracePeriod=10 Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.473433 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474449 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474603 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9h7l\" (UniqueName: \"kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474787 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474831 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.485885 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l" (OuterVolumeSpecName: "kube-api-access-c9h7l") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "kube-api-access-c9h7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.515915 4795 generic.go:334] "Generic (PLEG): container finished" podID="3d499f64-fbe0-4f89-af22-619a306e7857" containerID="fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25" exitCode=0 Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.515984 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.515986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" event={"ID":"3d499f64-fbe0-4f89-af22-619a306e7857","Type":"ContainerDied","Data":"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25"} Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.516016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" event={"ID":"3d499f64-fbe0-4f89-af22-619a306e7857","Type":"ContainerDied","Data":"033d346b30a1db4f7e9a01124daad98761199eedb3969eb07af8bdf4a1a9d7f0"} Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.516047 4795 scope.go:117] "RemoveContainer" containerID="fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.522265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c4fa02-a2cf-4349-afe3-292e38b50e33","Type":"ContainerStarted","Data":"17e7314f308a8f19b6455d8e266b166b150633defda1fb6d5be070ab71fa388a"} Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.522297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c4fa02-a2cf-4349-afe3-292e38b50e33","Type":"ContainerStarted","Data":"4b95271b2da881de9a6e5f405f425092a87dc98b58d5b4d26d8419924b278256"} Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.536031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.541031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.542587 4795 scope.go:117] "RemoveContainer" containerID="bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.553344 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config" (OuterVolumeSpecName: "config") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.556946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.560790 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.566265 4795 scope.go:117] "RemoveContainer" containerID="fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25" Mar 20 17:40:46 crc kubenswrapper[4795]: E0320 17:40:46.566742 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25\": container with ID starting with fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25 not found: ID does not exist" containerID="fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.566768 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25"} err="failed to get container status \"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25\": rpc error: code = NotFound desc = could not find container \"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25\": container with ID starting with fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25 not found: ID does not exist" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.566888 4795 scope.go:117] "RemoveContainer" containerID="bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953" Mar 20 17:40:46 crc kubenswrapper[4795]: E0320 17:40:46.567162 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953\": container with ID starting with bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953 not found: ID does not exist" containerID="bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.567180 
4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953"} err="failed to get container status \"bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953\": rpc error: code = NotFound desc = could not find container \"bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953\": container with ID starting with bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953 not found: ID does not exist" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577077 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577124 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577133 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577143 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577154 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577162 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9h7l\" 
(UniqueName: \"kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.847196 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"] Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.858473 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"] Mar 20 17:40:47 crc kubenswrapper[4795]: I0320 17:40:47.261521 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" path="/var/lib/kubelet/pods/3d499f64-fbe0-4f89-af22-619a306e7857/volumes" Mar 20 17:40:47 crc kubenswrapper[4795]: I0320 17:40:47.313089 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6697f55ff5-fj55x" podUID="e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 20 17:40:49 crc kubenswrapper[4795]: I0320 17:40:49.732048 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:49 crc kubenswrapper[4795]: I0320 17:40:49.762886 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.570476 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c4fa02-a2cf-4349-afe3-292e38b50e33","Type":"ContainerStarted","Data":"7234f4ee023a6896871077fc9773be9638c0383d34fbfecc5d3d8e15abd99bd9"} Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.595649 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.600430 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.530317599 podStartE2EDuration="7.600414079s" podCreationTimestamp="2026-03-20 17:40:43 +0000 UTC" firstStartedPulling="2026-03-20 17:40:44.374333949 +0000 UTC m=+1387.832365490" lastFinishedPulling="2026-03-20 17:40:49.444430389 +0000 UTC m=+1392.902461970" observedRunningTime="2026-03-20 17:40:50.591764175 +0000 UTC m=+1394.049795736" watchObservedRunningTime="2026-03-20 17:40:50.600414079 +0000 UTC m=+1394.058445620" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.743978 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-67csj"] Mar 20 17:40:50 crc kubenswrapper[4795]: E0320 17:40:50.745185 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="init" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.745260 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="init" Mar 20 17:40:50 crc kubenswrapper[4795]: E0320 17:40:50.745326 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="dnsmasq-dns" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.745383 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="dnsmasq-dns" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.745614 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="dnsmasq-dns" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.746260 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.748488 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.748995 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.759682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csvx4\" (UniqueName: \"kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.759771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.759812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.759843 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") 
" pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.765879 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-67csj"] Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.824067 4795 scope.go:117] "RemoveContainer" containerID="22dcfbd2225d9c0ffa8966a0b94e82b8d86d62d5548dc394c4f180ba099a7edd" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.862100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csvx4\" (UniqueName: \"kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.862185 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.862234 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.862263 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 
17:40:50.867769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.868898 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.869047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.878412 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csvx4\" (UniqueName: \"kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:51 crc kubenswrapper[4795]: I0320 17:40:51.067573 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:51 crc kubenswrapper[4795]: I0320 17:40:51.519299 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-67csj"] Mar 20 17:40:51 crc kubenswrapper[4795]: W0320 17:40:51.525632 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbc8602c_1f19_4825_b3e5_32d643f12430.slice/crio-a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28 WatchSource:0}: Error finding container a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28: Status 404 returned error can't find the container with id a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28 Mar 20 17:40:51 crc kubenswrapper[4795]: I0320 17:40:51.585239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-67csj" event={"ID":"fbc8602c-1f19-4825-b3e5-32d643f12430","Type":"ContainerStarted","Data":"a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28"} Mar 20 17:40:51 crc kubenswrapper[4795]: I0320 17:40:51.586195 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:40:52 crc kubenswrapper[4795]: I0320 17:40:52.597998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-67csj" event={"ID":"fbc8602c-1f19-4825-b3e5-32d643f12430","Type":"ContainerStarted","Data":"c3dbd02db17863581582e40f291eb346e5dca8aa3c7d277d71e53142232286eb"} Mar 20 17:40:52 crc kubenswrapper[4795]: I0320 17:40:52.633430 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-67csj" podStartSLOduration=2.633401236 podStartE2EDuration="2.633401236s" podCreationTimestamp="2026-03-20 17:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 17:40:52.620410439 +0000 UTC m=+1396.078442020" watchObservedRunningTime="2026-03-20 17:40:52.633401236 +0000 UTC m=+1396.091432817" Mar 20 17:40:52 crc kubenswrapper[4795]: I0320 17:40:52.827750 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:40:52 crc kubenswrapper[4795]: I0320 17:40:52.827825 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:40:53 crc kubenswrapper[4795]: I0320 17:40:53.840949 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:53 crc kubenswrapper[4795]: I0320 17:40:53.841597 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:57 crc kubenswrapper[4795]: I0320 17:40:57.653550 4795 generic.go:334] "Generic (PLEG): container finished" podID="fbc8602c-1f19-4825-b3e5-32d643f12430" containerID="c3dbd02db17863581582e40f291eb346e5dca8aa3c7d277d71e53142232286eb" exitCode=0 Mar 20 17:40:57 crc kubenswrapper[4795]: I0320 17:40:57.654139 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-67csj" event={"ID":"fbc8602c-1f19-4825-b3e5-32d643f12430","Type":"ContainerDied","Data":"c3dbd02db17863581582e40f291eb346e5dca8aa3c7d277d71e53142232286eb"} Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.097556 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.256194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle\") pod \"fbc8602c-1f19-4825-b3e5-32d643f12430\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.256302 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data\") pod \"fbc8602c-1f19-4825-b3e5-32d643f12430\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.256356 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csvx4\" (UniqueName: \"kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4\") pod \"fbc8602c-1f19-4825-b3e5-32d643f12430\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.256438 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts\") pod \"fbc8602c-1f19-4825-b3e5-32d643f12430\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.263968 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts" (OuterVolumeSpecName: "scripts") pod "fbc8602c-1f19-4825-b3e5-32d643f12430" (UID: "fbc8602c-1f19-4825-b3e5-32d643f12430"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.266721 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4" (OuterVolumeSpecName: "kube-api-access-csvx4") pod "fbc8602c-1f19-4825-b3e5-32d643f12430" (UID: "fbc8602c-1f19-4825-b3e5-32d643f12430"). InnerVolumeSpecName "kube-api-access-csvx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.294762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data" (OuterVolumeSpecName: "config-data") pod "fbc8602c-1f19-4825-b3e5-32d643f12430" (UID: "fbc8602c-1f19-4825-b3e5-32d643f12430"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.316051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbc8602c-1f19-4825-b3e5-32d643f12430" (UID: "fbc8602c-1f19-4825-b3e5-32d643f12430"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.359775 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.359815 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csvx4\" (UniqueName: \"kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.359828 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.359840 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.677613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-67csj" event={"ID":"fbc8602c-1f19-4825-b3e5-32d643f12430","Type":"ContainerDied","Data":"a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28"} Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.677957 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.677705 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.930504 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.931091 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-log" containerID="cri-o://d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb" gracePeriod=30 Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.931204 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-api" containerID="cri-o://3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e" gracePeriod=30 Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.971028 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.971312 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerName="nova-scheduler-scheduler" containerID="cri-o://380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" gracePeriod=30 Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.987456 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.987701 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-log" containerID="cri-o://6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd" gracePeriod=30 Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.987827 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-metadata" containerID="cri-o://4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14" gracePeriod=30 Mar 20 17:41:00 crc kubenswrapper[4795]: E0320 17:41:00.471532 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:41:00 crc kubenswrapper[4795]: E0320 17:41:00.473727 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:41:00 crc kubenswrapper[4795]: E0320 17:41:00.474760 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:41:00 crc kubenswrapper[4795]: E0320 17:41:00.474816 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerName="nova-scheduler-scheduler" Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.703547 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="58141da4-34b7-48d2-8648-8340b0e08c24" containerID="6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd" exitCode=143 Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.703644 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerDied","Data":"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd"} Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.710140 4795 generic.go:334] "Generic (PLEG): container finished" podID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerID="d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb" exitCode=143 Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.710199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerDied","Data":"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb"} Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.828281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.828342 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.553337 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.619145 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.646088 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.646353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhkc8\" (UniqueName: \"kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.646400 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.646419 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.646966 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.647026 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.647485 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs" (OuterVolumeSpecName: "logs") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.653340 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8" (OuterVolumeSpecName: "kube-api-access-hhkc8") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "kube-api-access-hhkc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.674704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data" (OuterVolumeSpecName: "config-data") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.690924 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.700143 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.720733 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.739745 4795 generic.go:334] "Generic (PLEG): container finished" podID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerID="3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e" exitCode=0 Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.739800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerDied","Data":"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e"} Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.739827 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerDied","Data":"587df48f97c3203a94e266d052630fadcdcbe04e7e5ddd68a72a13d942007e47"} Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.739843 4795 scope.go:117] "RemoveContainer" containerID="3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e" Mar 20 17:41:03 crc 
kubenswrapper[4795]: I0320 17:41:03.739937 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.746085 4795 generic.go:334] "Generic (PLEG): container finished" podID="58141da4-34b7-48d2-8648-8340b0e08c24" containerID="4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14" exitCode=0 Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.746174 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.746171 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerDied","Data":"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14"} Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.746318 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerDied","Data":"468acb8849abdd77a144584691b96b8cbadeae923d66dd538f230c0aee8d52cb"} Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.748408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data\") pod \"58141da4-34b7-48d2-8648-8340b0e08c24\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.748562 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs\") pod \"58141da4-34b7-48d2-8648-8340b0e08c24\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.748670 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs\") pod \"58141da4-34b7-48d2-8648-8340b0e08c24\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.748734 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle\") pod \"58141da4-34b7-48d2-8648-8340b0e08c24\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.748791 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf\") pod \"58141da4-34b7-48d2-8648-8340b0e08c24\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749189 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749205 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhkc8\" (UniqueName: \"kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749216 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749224 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749232 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749240 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.750561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs" (OuterVolumeSpecName: "logs") pod "58141da4-34b7-48d2-8648-8340b0e08c24" (UID: "58141da4-34b7-48d2-8648-8340b0e08c24"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.751842 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf" (OuterVolumeSpecName: "kube-api-access-zb8pf") pod "58141da4-34b7-48d2-8648-8340b0e08c24" (UID: "58141da4-34b7-48d2-8648-8340b0e08c24"). InnerVolumeSpecName "kube-api-access-zb8pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.775077 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data" (OuterVolumeSpecName: "config-data") pod "58141da4-34b7-48d2-8648-8340b0e08c24" (UID: "58141da4-34b7-48d2-8648-8340b0e08c24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.779677 4795 scope.go:117] "RemoveContainer" containerID="d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.781066 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58141da4-34b7-48d2-8648-8340b0e08c24" (UID: "58141da4-34b7-48d2-8648-8340b0e08c24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.807478 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.815385 4795 scope.go:117] "RemoveContainer" containerID="3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.815824 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e\": container with ID starting with 3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e not found: ID does not exist" containerID="3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.815856 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e"} err="failed to get container status \"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e\": rpc error: code = NotFound desc = could not find container \"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e\": container with ID starting with 
3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e not found: ID does not exist" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.815880 4795 scope.go:117] "RemoveContainer" containerID="d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.816156 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb\": container with ID starting with d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb not found: ID does not exist" containerID="d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.816175 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb"} err="failed to get container status \"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb\": rpc error: code = NotFound desc = could not find container \"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb\": container with ID starting with d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb not found: ID does not exist" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.816187 4795 scope.go:117] "RemoveContainer" containerID="4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.819994 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.834562 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.834966 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" 
containerName="nova-api-api" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.834985 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-api" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.835011 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-log" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835018 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-log" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.835030 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-metadata" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835037 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-metadata" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.835061 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc8602c-1f19-4825-b3e5-32d643f12430" containerName="nova-manage" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835069 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc8602c-1f19-4825-b3e5-32d643f12430" containerName="nova-manage" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.835083 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-log" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835091 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-log" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835282 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" 
containerName="nova-api-api" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835302 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-log" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835315 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-log" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835330 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-metadata" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835341 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc8602c-1f19-4825-b3e5-32d643f12430" containerName="nova-manage" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.836525 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.840913 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.841016 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.841167 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.846163 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.846444 4795 scope.go:117] "RemoveContainer" containerID="6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.847643 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "58141da4-34b7-48d2-8648-8340b0e08c24" (UID: "58141da4-34b7-48d2-8648-8340b0e08c24"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.850925 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.850970 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.850983 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.850992 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.851001 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.873462 4795 scope.go:117] "RemoveContainer" containerID="4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.873921 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14\": container with ID starting with 4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14 not found: ID does not exist" containerID="4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.873948 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14"} err="failed to get container status \"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14\": rpc error: code = NotFound desc = could not find container \"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14\": container with ID starting with 4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14 not found: ID does not exist" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.873969 4795 scope.go:117] "RemoveContainer" containerID="6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.874175 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd\": container with ID starting with 6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd not found: ID does not exist" containerID="6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.874211 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd"} err="failed to get container status \"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd\": rpc error: code = NotFound desc = could not find container 
\"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd\": container with ID starting with 6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd not found: ID does not exist" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkjzt\" (UniqueName: \"kubernetes.io/projected/480a6609-0395-4bda-9ec8-a3ebf30931a7-kube-api-access-xkjzt\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952464 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-config-data\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952487 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952524 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/480a6609-0395-4bda-9ec8-a3ebf30931a7-logs\") pod \"nova-api-0\" (UID: 
\"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.054849 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.054915 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkjzt\" (UniqueName: \"kubernetes.io/projected/480a6609-0395-4bda-9ec8-a3ebf30931a7-kube-api-access-xkjzt\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.054967 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-config-data\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.054990 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.055040 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/480a6609-0395-4bda-9ec8-a3ebf30931a7-logs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.055061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.056023 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/480a6609-0395-4bda-9ec8-a3ebf30931a7-logs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.059770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.060195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.060482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-config-data\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.062672 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.076051 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkjzt\" (UniqueName: \"kubernetes.io/projected/480a6609-0395-4bda-9ec8-a3ebf30931a7-kube-api-access-xkjzt\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.083561 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.102140 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.139736 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.150415 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.151936 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.152908 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.153154 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.162233 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.257930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.258202 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-config-data\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.258237 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zw7f\" (UniqueName: \"kubernetes.io/projected/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-kube-api-access-6zw7f\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.258333 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.258360 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-logs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.359919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-logs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.360172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.360211 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-config-data\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.361502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-logs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.361625 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6zw7f\" (UniqueName: \"kubernetes.io/projected/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-kube-api-access-6zw7f\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.361958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.367408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.369500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-config-data\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.384267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.388357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zw7f\" (UniqueName: \"kubernetes.io/projected/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-kube-api-access-6zw7f\") pod \"nova-metadata-0\" (UID: 
\"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.448534 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.598799 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.608390 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.759279 4795 generic.go:334] "Generic (PLEG): container finished" podID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" exitCode=0 Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.759388 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.759395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef24c878-faa8-4a0b-a303-951d0a457eef","Type":"ContainerDied","Data":"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463"} Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.759435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef24c878-faa8-4a0b-a303-951d0a457eef","Type":"ContainerDied","Data":"84bfddf694ecbc3a7df8e31cc881d1692c1599f3d1dc0bd61b96f49386da6d0c"} Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.759452 4795 scope.go:117] "RemoveContainer" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.760894 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"480a6609-0395-4bda-9ec8-a3ebf30931a7","Type":"ContainerStarted","Data":"befd9b02cdc9dc44ea8af5b4b2cc3a53e4fc63ae1c71353770f8bfbe9139169f"} Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.760922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"480a6609-0395-4bda-9ec8-a3ebf30931a7","Type":"ContainerStarted","Data":"83b850417ad4d2bdfcfb26a398d21951c423e8b6568b671ed54f3611a24d1406"} Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.769311 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data\") pod \"ef24c878-faa8-4a0b-a303-951d0a457eef\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.769425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle\") pod \"ef24c878-faa8-4a0b-a303-951d0a457eef\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.769549 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgnnx\" (UniqueName: \"kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx\") pod \"ef24c878-faa8-4a0b-a303-951d0a457eef\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.773778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx" (OuterVolumeSpecName: "kube-api-access-jgnnx") pod "ef24c878-faa8-4a0b-a303-951d0a457eef" (UID: "ef24c878-faa8-4a0b-a303-951d0a457eef"). InnerVolumeSpecName "kube-api-access-jgnnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.784615 4795 scope.go:117] "RemoveContainer" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" Mar 20 17:41:05 crc kubenswrapper[4795]: E0320 17:41:04.785575 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463\": container with ID starting with 380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463 not found: ID does not exist" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.785612 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463"} err="failed to get container status \"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463\": rpc error: code = NotFound desc = could not find container \"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463\": container with ID starting with 380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463 not found: ID does not exist" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.802308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef24c878-faa8-4a0b-a303-951d0a457eef" (UID: "ef24c878-faa8-4a0b-a303-951d0a457eef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.804813 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data" (OuterVolumeSpecName: "config-data") pod "ef24c878-faa8-4a0b-a303-951d0a457eef" (UID: "ef24c878-faa8-4a0b-a303-951d0a457eef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.871836 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.871867 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.871879 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgnnx\" (UniqueName: \"kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.099426 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.127746 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.160161 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:41:05 crc kubenswrapper[4795]: E0320 17:41:05.160570 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerName="nova-scheduler-scheduler" Mar 20 17:41:05 crc 
kubenswrapper[4795]: I0320 17:41:05.160584 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerName="nova-scheduler-scheduler" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.160800 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerName="nova-scheduler-scheduler" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.161400 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.163852 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.168520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.270241 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" path="/var/lib/kubelet/pods/58141da4-34b7-48d2-8648-8340b0e08c24/volumes" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.271224 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" path="/var/lib/kubelet/pods/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207/volumes" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.271970 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" path="/var/lib/kubelet/pods/ef24c878-faa8-4a0b-a303-951d0a457eef/volumes" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.279699 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-config-data\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " 
pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.279816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.279860 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2lnp\" (UniqueName: \"kubernetes.io/projected/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-kube-api-access-t2lnp\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.382122 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-config-data\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.382248 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.382296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2lnp\" (UniqueName: \"kubernetes.io/projected/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-kube-api-access-t2lnp\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.387644 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-config-data\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.399956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.400797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2lnp\" (UniqueName: \"kubernetes.io/projected/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-kube-api-access-t2lnp\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.520939 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.545297 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.770285 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff4af01b-01b5-4154-8591-7ec99e3d6ef0","Type":"ContainerStarted","Data":"799bc58ae9c31f2b20b85a53bf74bab2facc31011fbf66c3eaf5d018bd956aca"} Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.773743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"480a6609-0395-4bda-9ec8-a3ebf30931a7","Type":"ContainerStarted","Data":"083a7a13f04e5180c257f05356b9040c12dea0510d5f921c7ebdd383b6985b45"} Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.808598 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.808573643 podStartE2EDuration="2.808573643s" podCreationTimestamp="2026-03-20 17:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:41:05.793983709 +0000 UTC m=+1409.252015270" watchObservedRunningTime="2026-03-20 17:41:05.808573643 +0000 UTC m=+1409.266605204" Mar 20 17:41:05 crc kubenswrapper[4795]: W0320 17:41:05.984127 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23f56ff_eceb_4891_87e5_57ebeb7eba8d.slice/crio-8ad662a0532ae35ef56ef051374a286dced762b665b30f581b6dbf6787f71776 WatchSource:0}: Error finding container 8ad662a0532ae35ef56ef051374a286dced762b665b30f581b6dbf6787f71776: Status 404 returned error can't find the container with id 8ad662a0532ae35ef56ef051374a286dced762b665b30f581b6dbf6787f71776 Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.987931 4795 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.784929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c23f56ff-eceb-4891-87e5-57ebeb7eba8d","Type":"ContainerStarted","Data":"85945fef7e7e4818965310af4a0201164d31db19ff0f5e8619e3262d43b4c864"} Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.785231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c23f56ff-eceb-4891-87e5-57ebeb7eba8d","Type":"ContainerStarted","Data":"8ad662a0532ae35ef56ef051374a286dced762b665b30f581b6dbf6787f71776"} Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.789058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff4af01b-01b5-4154-8591-7ec99e3d6ef0","Type":"ContainerStarted","Data":"bb468ec1e38e6252f1835d5a48e30b0929f67abd3f7646a4757e2335b49b5959"} Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.789297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff4af01b-01b5-4154-8591-7ec99e3d6ef0","Type":"ContainerStarted","Data":"b10f2f42fbd9069430b1583e9cb52464dfc2872b3963701e1e04e31f7d461b45"} Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.807893 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.807872503 podStartE2EDuration="1.807872503s" podCreationTimestamp="2026-03-20 17:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:41:06.803932512 +0000 UTC m=+1410.261964053" watchObservedRunningTime="2026-03-20 17:41:06.807872503 +0000 UTC m=+1410.265904044" Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.830651 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.830628977 podStartE2EDuration="2.830628977s" podCreationTimestamp="2026-03-20 17:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:41:06.82742327 +0000 UTC m=+1410.285454851" watchObservedRunningTime="2026-03-20 17:41:06.830628977 +0000 UTC m=+1410.288660528" Mar 20 17:41:10 crc kubenswrapper[4795]: I0320 17:41:10.521347 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 17:41:11 crc kubenswrapper[4795]: I0320 17:41:11.299931 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:41:11 crc kubenswrapper[4795]: I0320 17:41:11.300289 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:41:13 crc kubenswrapper[4795]: I0320 17:41:13.888515 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 17:41:14 crc kubenswrapper[4795]: I0320 17:41:14.152921 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:41:14 crc kubenswrapper[4795]: I0320 17:41:14.152959 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:41:14 crc kubenswrapper[4795]: I0320 17:41:14.449755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:41:14 crc 
kubenswrapper[4795]: I0320 17:41:14.449819 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.167894 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="480a6609-0395-4bda-9ec8-a3ebf30931a7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.167898 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="480a6609-0395-4bda-9ec8-a3ebf30931a7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.459813 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ff4af01b-01b5-4154-8591-7ec99e3d6ef0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.460062 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ff4af01b-01b5-4154-8591-7ec99e3d6ef0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.521174 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.566010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 17:41:15 crc 
kubenswrapper[4795]: I0320 17:41:15.925566 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 17:41:22 crc kubenswrapper[4795]: I0320 17:41:22.152863 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:41:22 crc kubenswrapper[4795]: I0320 17:41:22.153455 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:41:22 crc kubenswrapper[4795]: I0320 17:41:22.449559 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:41:22 crc kubenswrapper[4795]: I0320 17:41:22.449954 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.159320 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.159739 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.169092 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.170027 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.455604 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.457266 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.461465 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:41:24 crc kubenswrapper[4795]: 
I0320 17:41:24.993919 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:41:32 crc kubenswrapper[4795]: I0320 17:41:32.909300 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:33 crc kubenswrapper[4795]: I0320 17:41:33.920078 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:41:36 crc kubenswrapper[4795]: I0320 17:41:36.038607 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="rabbitmq" containerID="cri-o://95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3" gracePeriod=57 Mar 20 17:41:36 crc kubenswrapper[4795]: I0320 17:41:36.781148 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="rabbitmq" containerID="cri-o://930cc5d12a5b8ceb897b37f689f02ab87b93b53244832868d8761d1d4336b1e3" gracePeriod=58 Mar 20 17:41:37 crc kubenswrapper[4795]: I0320 17:41:37.500176 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Mar 20 17:41:37 crc kubenswrapper[4795]: I0320 17:41:37.832677 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.125667 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3e6834b-7e74-46f8-a734-b473080c05d3" 
containerID="930cc5d12a5b8ceb897b37f689f02ab87b93b53244832868d8761d1d4336b1e3" exitCode=0 Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.125770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerDied","Data":"930cc5d12a5b8ceb897b37f689f02ab87b93b53244832868d8761d1d4336b1e3"} Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.417345 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583271 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583317 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc 
kubenswrapper[4795]: I0320 17:41:38.583488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583554 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6r2k\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583579 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583633 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583655 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583716 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") 
pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583746 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.584564 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.584891 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.588506 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.589033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info" (OuterVolumeSpecName: "pod-info") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.589307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.590773 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.592157 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.604177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k" (OuterVolumeSpecName: "kube-api-access-j6r2k") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "kube-api-access-j6r2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.627416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data" (OuterVolumeSpecName: "config-data") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.636585 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf" (OuterVolumeSpecName: "server-conf") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685572 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685603 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685618 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6r2k\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685649 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685660 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685723 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685738 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 
17:41:38.685749 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685761 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685771 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.703028 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.711966 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.787808 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.787843 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.143736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerDied","Data":"12a00ee882324adc5e7b3fa5833c8430141d6a20302db2d5f549cf873b0d421d"} Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.143797 4795 scope.go:117] "RemoveContainer" containerID="930cc5d12a5b8ceb897b37f689f02ab87b93b53244832868d8761d1d4336b1e3" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.144143 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.167707 4795 scope.go:117] "RemoveContainer" containerID="5473602d5499b1067c63d6b98d02f2810f56405e993453774e2f6c5d19c36aea" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.202022 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.234601 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.248096 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:41:39 crc kubenswrapper[4795]: E0320 17:41:39.248571 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="rabbitmq" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.248597 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="rabbitmq" Mar 20 17:41:39 crc kubenswrapper[4795]: E0320 17:41:39.248635 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="setup-container" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.248645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="setup-container" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.248870 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="rabbitmq" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.250172 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.254149 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.254606 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.254774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.254899 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.255000 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.255134 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pf5bc" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.255253 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.278915 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" path="/var/lib/kubelet/pods/d3e6834b-7e74-46f8-a734-b473080c05d3/volumes" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.279646 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.399905 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.399981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400042 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400137 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400220 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400335 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6cbc\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-kube-api-access-c6cbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502197 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502846 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502947 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6cbc\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-kube-api-access-c6cbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502818 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.503026 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.503130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.504027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.504179 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.504735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.506489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.506490 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.510473 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.512237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.512628 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.539099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6cbc\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-kube-api-access-c6cbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.539178 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.598011 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:40 crc kubenswrapper[4795]: W0320 17:41:40.127626 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30c1ffc4_752a_4b0a_a95b_2bfbc458dc53.slice/crio-1a67d7e8adeac60383a3acf7797cced3bf60bcab9e2b95a0728c099729e98f9f WatchSource:0}: Error finding container 1a67d7e8adeac60383a3acf7797cced3bf60bcab9e2b95a0728c099729e98f9f: Status 404 returned error can't find the container with id 1a67d7e8adeac60383a3acf7797cced3bf60bcab9e2b95a0728c099729e98f9f Mar 20 17:41:40 crc kubenswrapper[4795]: I0320 17:41:40.128449 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:41:40 crc kubenswrapper[4795]: I0320 17:41:40.156108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53","Type":"ContainerStarted","Data":"1a67d7e8adeac60383a3acf7797cced3bf60bcab9e2b95a0728c099729e98f9f"} Mar 20 17:41:41 crc kubenswrapper[4795]: I0320 17:41:41.299988 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:41:41 crc kubenswrapper[4795]: I0320 17:41:41.300027 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.178357 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53","Type":"ContainerStarted","Data":"91b2765becbe485413b561b8cb2a1ab6831f2a2f0328f3ad53837ee41431baef"} Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.768545 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.871929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873080 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4h5b\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873128 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873157 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873478 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873548 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873607 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873752 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: 
I0320 17:41:42.874774 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.879997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.880192 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.880357 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b" (OuterVolumeSpecName: "kube-api-access-w4h5b") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "kube-api-access-w4h5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.887200 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info" (OuterVolumeSpecName: "pod-info") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.888068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.891879 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.902870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf" (OuterVolumeSpecName: "server-conf") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.903967 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.917243 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data" (OuterVolumeSpecName: "config-data") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.975964 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976002 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976014 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4h5b\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976024 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data\") on node \"crc\" 
DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976033 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976059 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976068 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976077 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976085 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976095 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.002141 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.003632 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.077885 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.077914 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.187310 4795 generic.go:334] "Generic (PLEG): container finished" podID="b8103489-e552-49b0-a32a-1069a46feff9" containerID="95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3" exitCode=0 Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.187387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerDied","Data":"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3"} Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.187423 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerDied","Data":"0e5a7ece35e45546c5839b24c64b62f1f72a3acb63297d6f97fc0dca60bde01d"} Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.187442 4795 scope.go:117] "RemoveContainer" containerID="95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.187576 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.239700 4795 scope.go:117] "RemoveContainer" containerID="ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.245875 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.267095 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.270133 4795 scope.go:117] "RemoveContainer" containerID="95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3" Mar 20 17:41:43 crc kubenswrapper[4795]: E0320 17:41:43.272597 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3\": container with ID starting with 95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3 not found: ID does not exist" containerID="95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.272631 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3"} err="failed to get container status \"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3\": rpc error: code = NotFound desc = could not find container \"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3\": container with ID starting with 95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3 not found: ID does not exist" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.272651 4795 scope.go:117] "RemoveContainer" containerID="ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157" Mar 20 17:41:43 crc 
kubenswrapper[4795]: E0320 17:41:43.275820 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157\": container with ID starting with ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157 not found: ID does not exist" containerID="ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.275852 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157"} err="failed to get container status \"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157\": rpc error: code = NotFound desc = could not find container \"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157\": container with ID starting with ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157 not found: ID does not exist" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.278020 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:43 crc kubenswrapper[4795]: E0320 17:41:43.278398 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="rabbitmq" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.278410 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="rabbitmq" Mar 20 17:41:43 crc kubenswrapper[4795]: E0320 17:41:43.278431 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="setup-container" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.278437 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="setup-container" Mar 20 17:41:43 crc 
kubenswrapper[4795]: I0320 17:41:43.278625 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="rabbitmq" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.279624 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.283826 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-84wwv" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.284130 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.284244 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.284448 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.285919 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.285943 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.285969 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.295159 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:43 crc kubenswrapper[4795]: E0320 17:41:43.348799 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8103489_e552_49b0_a32a_1069a46feff9.slice/crio-0e5a7ece35e45546c5839b24c64b62f1f72a3acb63297d6f97fc0dca60bde01d\": RecentStats: unable to find data in memory cache]" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408605 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchqp\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-kube-api-access-xchqp\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408776 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408902 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408953 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.409020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.409099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.409728 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.511713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.511833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.511959 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512005 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: 
I0320 17:41:43.512062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xchqp\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-kube-api-access-xchqp\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512110 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512939 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512964 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.513279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.513646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.513743 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.513810 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.514232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 
17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.514302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.515746 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.530622 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.531191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.531272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.533728 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.546607 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchqp\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-kube-api-access-xchqp\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.565128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.646542 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:41:44 crc kubenswrapper[4795]: I0320 17:41:44.177108 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:44 crc kubenswrapper[4795]: W0320 17:41:44.183566 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad841f4d_fa5f_4383_86d5_ab5a93f6e7fc.slice/crio-3fe36fa21c6591620fd3178e1c1040a796f33f9c0e121e731657b304af794a0d WatchSource:0}: Error finding container 3fe36fa21c6591620fd3178e1c1040a796f33f9c0e121e731657b304af794a0d: Status 404 returned error can't find the container with id 3fe36fa21c6591620fd3178e1c1040a796f33f9c0e121e731657b304af794a0d Mar 20 17:41:44 crc kubenswrapper[4795]: I0320 17:41:44.199175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc","Type":"ContainerStarted","Data":"3fe36fa21c6591620fd3178e1c1040a796f33f9c0e121e731657b304af794a0d"} Mar 20 17:41:45 crc kubenswrapper[4795]: I0320 17:41:45.265494 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8103489-e552-49b0-a32a-1069a46feff9" path="/var/lib/kubelet/pods/b8103489-e552-49b0-a32a-1069a46feff9/volumes" Mar 20 17:41:45 crc kubenswrapper[4795]: I0320 17:41:45.966499 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"] Mar 20 17:41:45 crc kubenswrapper[4795]: I0320 17:41:45.968622 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:45 crc kubenswrapper[4795]: I0320 17:41:45.970525 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 20 17:41:45 crc kubenswrapper[4795]: I0320 17:41:45.997073 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"] Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069102 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqx4z\" (UniqueName: \"kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171391 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171485 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.172288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc 
kubenswrapper[4795]: I0320 17:41:46.172361 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.172360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqx4z\" (UniqueName: \"kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.172581 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.173064 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.173338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.173359 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.189404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqx4z\" (UniqueName: \"kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.219081 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc","Type":"ContainerStarted","Data":"6c9bb98a5b27dfaab93f9b0bf86e2dc36843d779c16099e2338e7ce5f1541db7"} Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.290762 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.547153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"] Mar 20 17:41:47 crc kubenswrapper[4795]: I0320 17:41:47.231953 4795 generic.go:334] "Generic (PLEG): container finished" podID="38f2311f-ace5-4469-906b-05443d175f81" containerID="cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff" exitCode=0 Mar 20 17:41:47 crc kubenswrapper[4795]: I0320 17:41:47.232030 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" event={"ID":"38f2311f-ace5-4469-906b-05443d175f81","Type":"ContainerDied","Data":"cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff"} Mar 20 17:41:47 crc kubenswrapper[4795]: I0320 17:41:47.232423 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" event={"ID":"38f2311f-ace5-4469-906b-05443d175f81","Type":"ContainerStarted","Data":"8a4135aec749bd5b9e65098bed9d5a85b6fe9d80a88ef7fade7429aaedbcd5f3"} Mar 20 17:41:48 crc kubenswrapper[4795]: I0320 17:41:48.243914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" event={"ID":"38f2311f-ace5-4469-906b-05443d175f81","Type":"ContainerStarted","Data":"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824"} Mar 20 17:41:48 crc kubenswrapper[4795]: I0320 17:41:48.244470 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:48 crc kubenswrapper[4795]: I0320 17:41:48.284103 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" podStartSLOduration=3.284077361 podStartE2EDuration="3.284077361s" podCreationTimestamp="2026-03-20 17:41:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:41:48.271799247 +0000 UTC m=+1451.729830868" watchObservedRunningTime="2026-03-20 17:41:48.284077361 +0000 UTC m=+1451.742108942" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.162485 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"] Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.166290 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.184881 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"] Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.292906 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.325786 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vrmg\" (UniqueName: \"kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.325962 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.326100 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.378145 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.378460 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="dnsmasq-dns" containerID="cri-o://9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8" gracePeriod=10 Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.431282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vrmg\" (UniqueName: \"kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.431739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.431818 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.433368 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.433903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.460203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vrmg\" (UniqueName: \"kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.528068 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.565479 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-ch8jm"] Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.568414 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.610660 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-ch8jm"] Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641187 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-config\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641242 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641343 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641386 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641452 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdc26\" (UniqueName: \"kubernetes.io/projected/5c5c2934-fe58-4707-9bb7-a5e2372bad83-kube-api-access-kdc26\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.742836 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.742987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.743093 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdc26\" (UniqueName: \"kubernetes.io/projected/5c5c2934-fe58-4707-9bb7-a5e2372bad83-kube-api-access-kdc26\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.743190 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.743284 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.743384 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-config\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.743455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.744317 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.744707 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.744978 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.745270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.745621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.745817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-config\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.791380 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdc26\" (UniqueName: \"kubernetes.io/projected/5c5c2934-fe58-4707-9bb7-a5e2372bad83-kube-api-access-kdc26\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.006755 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.013040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.049812 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.049873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.049955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: 
\"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.049993 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmn5j\" (UniqueName: \"kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.050076 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.050143 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.058444 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j" (OuterVolumeSpecName: "kube-api-access-dmn5j") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "kube-api-access-dmn5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.104259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.108240 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.112068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.121197 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config" (OuterVolumeSpecName: "config") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.123024 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156108 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmn5j\" (UniqueName: \"kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156133 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156142 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156150 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156158 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156167 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.231071 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"] Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.428808 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerStarted","Data":"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde"} Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.428874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerStarted","Data":"5fc09bb80d702cbcbe035b81e8b835d219b1710a866cf58373f31844d607a73c"} Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.431410 4795 generic.go:334] "Generic (PLEG): container finished" podID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerID="9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8" exitCode=0 Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.431449 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.431466 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" event={"ID":"35b8efb0-212f-4ee0-bb05-4655aff260b5","Type":"ContainerDied","Data":"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8"} Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.432002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" event={"ID":"35b8efb0-212f-4ee0-bb05-4655aff260b5","Type":"ContainerDied","Data":"54687987edb9f2765e8d4f7b8bfef3664f1024d2c67848e40765c69ff1c22cea"} Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.432041 4795 scope.go:117] "RemoveContainer" containerID="9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.449912 4795 scope.go:117] "RemoveContainer" containerID="b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087" Mar 20 17:41:57 crc kubenswrapper[4795]: 
I0320 17:41:57.460567 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.467589 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.473324 4795 scope.go:117] "RemoveContainer" containerID="9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8" Mar 20 17:41:57 crc kubenswrapper[4795]: E0320 17:41:57.474951 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8\": container with ID starting with 9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8 not found: ID does not exist" containerID="9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.474990 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8"} err="failed to get container status \"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8\": rpc error: code = NotFound desc = could not find container \"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8\": container with ID starting with 9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8 not found: ID does not exist" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.475012 4795 scope.go:117] "RemoveContainer" containerID="b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087" Mar 20 17:41:57 crc kubenswrapper[4795]: E0320 17:41:57.475918 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087\": container with ID starting with 
b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087 not found: ID does not exist" containerID="b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.475974 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087"} err="failed to get container status \"b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087\": rpc error: code = NotFound desc = could not find container \"b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087\": container with ID starting with b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087 not found: ID does not exist" Mar 20 17:41:57 crc kubenswrapper[4795]: W0320 17:41:57.499512 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c5c2934_fe58_4707_9bb7_a5e2372bad83.slice/crio-ab74db3fa7a7744639604aaeb749f8f976df6b4fb83ed368049c20e9d80debc3 WatchSource:0}: Error finding container ab74db3fa7a7744639604aaeb749f8f976df6b4fb83ed368049c20e9d80debc3: Status 404 returned error can't find the container with id ab74db3fa7a7744639604aaeb749f8f976df6b4fb83ed368049c20e9d80debc3 Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.504879 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-ch8jm"] Mar 20 17:41:58 crc kubenswrapper[4795]: I0320 17:41:58.445113 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac35d627-20df-4aad-9779-e154f9cb617a" containerID="a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde" exitCode=0 Mar 20 17:41:58 crc kubenswrapper[4795]: I0320 17:41:58.445184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxkd" 
event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerDied","Data":"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde"} Mar 20 17:41:58 crc kubenswrapper[4795]: I0320 17:41:58.453601 4795 generic.go:334] "Generic (PLEG): container finished" podID="5c5c2934-fe58-4707-9bb7-a5e2372bad83" containerID="e5dd02238cefff0ea9b15c3254b140992155778ea40801eac45bd1f08b16bf9c" exitCode=0 Mar 20 17:41:58 crc kubenswrapper[4795]: I0320 17:41:58.453679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" event={"ID":"5c5c2934-fe58-4707-9bb7-a5e2372bad83","Type":"ContainerDied","Data":"e5dd02238cefff0ea9b15c3254b140992155778ea40801eac45bd1f08b16bf9c"} Mar 20 17:41:58 crc kubenswrapper[4795]: I0320 17:41:58.453796 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" event={"ID":"5c5c2934-fe58-4707-9bb7-a5e2372bad83","Type":"ContainerStarted","Data":"ab74db3fa7a7744639604aaeb749f8f976df6b4fb83ed368049c20e9d80debc3"} Mar 20 17:41:59 crc kubenswrapper[4795]: I0320 17:41:59.278154 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" path="/var/lib/kubelet/pods/35b8efb0-212f-4ee0-bb05-4655aff260b5/volumes" Mar 20 17:41:59 crc kubenswrapper[4795]: I0320 17:41:59.479244 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" event={"ID":"5c5c2934-fe58-4707-9bb7-a5e2372bad83","Type":"ContainerStarted","Data":"de2ab699cb1b7869321e76fdc3a3051d733e4a49eed44f4ec3e8d28f0e328652"} Mar 20 17:41:59 crc kubenswrapper[4795]: I0320 17:41:59.479438 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:59 crc kubenswrapper[4795]: I0320 17:41:59.507928 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" podStartSLOduration=3.507905884 
podStartE2EDuration="3.507905884s" podCreationTimestamp="2026-03-20 17:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:41:59.503011725 +0000 UTC m=+1462.961043276" watchObservedRunningTime="2026-03-20 17:41:59.507905884 +0000 UTC m=+1462.965937425" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.161487 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567142-zrn58"] Mar 20 17:42:00 crc kubenswrapper[4795]: E0320 17:42:00.161943 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="dnsmasq-dns" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.161964 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="dnsmasq-dns" Mar 20 17:42:00 crc kubenswrapper[4795]: E0320 17:42:00.162019 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="init" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.162028 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="init" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.162247 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="dnsmasq-dns" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.163001 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.167063 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.167997 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.168009 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.178177 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-zrn58"] Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.223891 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4s65\" (UniqueName: \"kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65\") pod \"auto-csr-approver-29567142-zrn58\" (UID: \"df931d18-2dae-408e-823d-45c28b0a31c2\") " pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.326336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4s65\" (UniqueName: \"kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65\") pod \"auto-csr-approver-29567142-zrn58\" (UID: \"df931d18-2dae-408e-823d-45c28b0a31c2\") " pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.346602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4s65\" (UniqueName: \"kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65\") pod \"auto-csr-approver-29567142-zrn58\" (UID: \"df931d18-2dae-408e-823d-45c28b0a31c2\") " 
pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.482981 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.491236 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac35d627-20df-4aad-9779-e154f9cb617a" containerID="e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5" exitCode=0 Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.491331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerDied","Data":"e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5"} Mar 20 17:42:00 crc kubenswrapper[4795]: W0320 17:42:00.997536 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf931d18_2dae_408e_823d_45c28b0a31c2.slice/crio-1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6 WatchSource:0}: Error finding container 1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6: Status 404 returned error can't find the container with id 1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6 Mar 20 17:42:01 crc kubenswrapper[4795]: I0320 17:42:01.014184 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-zrn58"] Mar 20 17:42:01 crc kubenswrapper[4795]: I0320 17:42:01.500282 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-zrn58" event={"ID":"df931d18-2dae-408e-823d-45c28b0a31c2","Type":"ContainerStarted","Data":"1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6"} Mar 20 17:42:01 crc kubenswrapper[4795]: I0320 17:42:01.503934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerStarted","Data":"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66"} Mar 20 17:42:01 crc kubenswrapper[4795]: I0320 17:42:01.527016 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8rxkd" podStartSLOduration=3.075351152 podStartE2EDuration="5.526996117s" podCreationTimestamp="2026-03-20 17:41:56 +0000 UTC" firstStartedPulling="2026-03-20 17:41:58.447928263 +0000 UTC m=+1461.905959844" lastFinishedPulling="2026-03-20 17:42:00.899573258 +0000 UTC m=+1464.357604809" observedRunningTime="2026-03-20 17:42:01.518984482 +0000 UTC m=+1464.977016033" watchObservedRunningTime="2026-03-20 17:42:01.526996117 +0000 UTC m=+1464.985027668" Mar 20 17:42:03 crc kubenswrapper[4795]: I0320 17:42:03.533361 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-zrn58" event={"ID":"df931d18-2dae-408e-823d-45c28b0a31c2","Type":"ContainerStarted","Data":"ca6e296f2643b1e5c67cd7c021c2bf95d4bbdd0b4c6082814566acaf425b562b"} Mar 20 17:42:04 crc kubenswrapper[4795]: I0320 17:42:04.548850 4795 generic.go:334] "Generic (PLEG): container finished" podID="df931d18-2dae-408e-823d-45c28b0a31c2" containerID="ca6e296f2643b1e5c67cd7c021c2bf95d4bbdd0b4c6082814566acaf425b562b" exitCode=0 Mar 20 17:42:04 crc kubenswrapper[4795]: I0320 17:42:04.548987 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-zrn58" event={"ID":"df931d18-2dae-408e-823d-45c28b0a31c2","Type":"ContainerDied","Data":"ca6e296f2643b1e5c67cd7c021c2bf95d4bbdd0b4c6082814566acaf425b562b"} Mar 20 17:42:04 crc kubenswrapper[4795]: I0320 17:42:04.937455 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.033282 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4s65\" (UniqueName: \"kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65\") pod \"df931d18-2dae-408e-823d-45c28b0a31c2\" (UID: \"df931d18-2dae-408e-823d-45c28b0a31c2\") " Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.039496 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65" (OuterVolumeSpecName: "kube-api-access-r4s65") pod "df931d18-2dae-408e-823d-45c28b0a31c2" (UID: "df931d18-2dae-408e-823d-45c28b0a31c2"). InnerVolumeSpecName "kube-api-access-r4s65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.135248 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4s65\" (UniqueName: \"kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.562657 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-zrn58" event={"ID":"df931d18-2dae-408e-823d-45c28b0a31c2","Type":"ContainerDied","Data":"1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6"} Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.563053 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6" Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.562796 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.031425 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-j4mtv"] Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.043916 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-j4mtv"] Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.528517 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.528602 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.619207 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.680658 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"] Mar 20 17:42:06 crc kubenswrapper[4795]: E0320 17:42:06.681217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df931d18-2dae-408e-823d-45c28b0a31c2" containerName="oc" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.681241 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="df931d18-2dae-408e-823d-45c28b0a31c2" containerName="oc" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.681490 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="df931d18-2dae-408e-823d-45c28b0a31c2" containerName="oc" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.709966 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.710158 4795 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.743151 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"] Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.772181 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjss4\" (UniqueName: \"kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.772323 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.772403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.873678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjss4\" (UniqueName: \"kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.873815 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.873881 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.874458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.874532 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.892882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjss4\" (UniqueName: \"kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.014859 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.048499 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.075651 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"] Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.075956 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="dnsmasq-dns" containerID="cri-o://c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824" gracePeriod=10 Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.264740 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f88deb-b38d-4c52-a901-baeb9da08559" path="/var/lib/kubelet/pods/38f88deb-b38d-4c52-a901-baeb9da08559/volumes" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.574034 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"] Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.583491 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.613512 4795 generic.go:334] "Generic (PLEG): container finished" podID="38f2311f-ace5-4469-906b-05443d175f81" containerID="c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824" exitCode=0 Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.614414 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.614896 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" event={"ID":"38f2311f-ace5-4469-906b-05443d175f81","Type":"ContainerDied","Data":"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824"} Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.614930 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" event={"ID":"38f2311f-ace5-4469-906b-05443d175f81","Type":"ContainerDied","Data":"8a4135aec749bd5b9e65098bed9d5a85b6fe9d80a88ef7fade7429aaedbcd5f3"} Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.614950 4795 scope.go:117] "RemoveContainer" containerID="c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.682974 4795 scope.go:117] "RemoveContainer" containerID="cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692174 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692560 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqx4z\" (UniqueName: \"kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692751 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692832 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.707567 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z" (OuterVolumeSpecName: "kube-api-access-dqx4z") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). 
InnerVolumeSpecName "kube-api-access-dqx4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.727863 4795 scope.go:117] "RemoveContainer" containerID="c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824" Mar 20 17:42:07 crc kubenswrapper[4795]: E0320 17:42:07.730456 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824\": container with ID starting with c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824 not found: ID does not exist" containerID="c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.730526 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824"} err="failed to get container status \"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824\": rpc error: code = NotFound desc = could not find container \"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824\": container with ID starting with c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824 not found: ID does not exist" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.730555 4795 scope.go:117] "RemoveContainer" containerID="cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff" Mar 20 17:42:07 crc kubenswrapper[4795]: E0320 17:42:07.732301 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff\": container with ID starting with cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff not found: ID does not exist" containerID="cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff" Mar 20 17:42:07 
crc kubenswrapper[4795]: I0320 17:42:07.732329 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff"} err="failed to get container status \"cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff\": rpc error: code = NotFound desc = could not find container \"cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff\": container with ID starting with cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff not found: ID does not exist" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.765812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config" (OuterVolumeSpecName: "config") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.765855 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.769839 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.782766 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.791063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.795959 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.795998 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.796012 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqx4z\" (UniqueName: \"kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.796025 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 
17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.796036 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.796048 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.804080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.897243 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.986277 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"] Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.996946 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"] Mar 20 17:42:08 crc kubenswrapper[4795]: I0320 17:42:08.217872 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"] Mar 20 17:42:08 crc kubenswrapper[4795]: I0320 17:42:08.631149 4795 generic.go:334] "Generic (PLEG): container finished" podID="196de415-75ca-4b43-bb26-0a9a5a993b1e" 
containerID="d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa" exitCode=0 Mar 20 17:42:08 crc kubenswrapper[4795]: I0320 17:42:08.631246 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerDied","Data":"d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa"} Mar 20 17:42:08 crc kubenswrapper[4795]: I0320 17:42:08.631322 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerStarted","Data":"fabd95db1d654ea27cdf8aa4144ea02558f6e4e1468eabf7d4dab5f79064ffa1"} Mar 20 17:42:08 crc kubenswrapper[4795]: I0320 17:42:08.631472 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rxkd" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="registry-server" containerID="cri-o://f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66" gracePeriod=2 Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.106785 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.223124 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content\") pod \"ac35d627-20df-4aad-9779-e154f9cb617a\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.223173 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities\") pod \"ac35d627-20df-4aad-9779-e154f9cb617a\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.223245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vrmg\" (UniqueName: \"kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg\") pod \"ac35d627-20df-4aad-9779-e154f9cb617a\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.224133 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities" (OuterVolumeSpecName: "utilities") pod "ac35d627-20df-4aad-9779-e154f9cb617a" (UID: "ac35d627-20df-4aad-9779-e154f9cb617a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.224375 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.231554 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg" (OuterVolumeSpecName: "kube-api-access-2vrmg") pod "ac35d627-20df-4aad-9779-e154f9cb617a" (UID: "ac35d627-20df-4aad-9779-e154f9cb617a"). InnerVolumeSpecName "kube-api-access-2vrmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.278414 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f2311f-ace5-4469-906b-05443d175f81" path="/var/lib/kubelet/pods/38f2311f-ace5-4469-906b-05443d175f81/volumes" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.327265 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vrmg\" (UniqueName: \"kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.328766 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac35d627-20df-4aad-9779-e154f9cb617a" (UID: "ac35d627-20df-4aad-9779-e154f9cb617a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.429120 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.687749 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac35d627-20df-4aad-9779-e154f9cb617a" containerID="f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66" exitCode=0 Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.687818 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerDied","Data":"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66"} Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.687863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerDied","Data":"5fc09bb80d702cbcbe035b81e8b835d219b1710a866cf58373f31844d607a73c"} Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.687922 4795 scope.go:117] "RemoveContainer" containerID="f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.687987 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.710074 4795 scope.go:117] "RemoveContainer" containerID="e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.739790 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"] Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.741006 4795 scope.go:117] "RemoveContainer" containerID="a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.749029 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"] Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.811898 4795 scope.go:117] "RemoveContainer" containerID="f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66" Mar 20 17:42:09 crc kubenswrapper[4795]: E0320 17:42:09.812469 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66\": container with ID starting with f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66 not found: ID does not exist" containerID="f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.812511 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66"} err="failed to get container status \"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66\": rpc error: code = NotFound desc = could not find container \"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66\": container with ID starting with f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66 not 
found: ID does not exist" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.812537 4795 scope.go:117] "RemoveContainer" containerID="e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5" Mar 20 17:42:09 crc kubenswrapper[4795]: E0320 17:42:09.812870 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5\": container with ID starting with e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5 not found: ID does not exist" containerID="e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.812910 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5"} err="failed to get container status \"e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5\": rpc error: code = NotFound desc = could not find container \"e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5\": container with ID starting with e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5 not found: ID does not exist" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.812935 4795 scope.go:117] "RemoveContainer" containerID="a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde" Mar 20 17:42:09 crc kubenswrapper[4795]: E0320 17:42:09.813193 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde\": container with ID starting with a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde not found: ID does not exist" containerID="a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde" Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.813216 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde"} err="failed to get container status \"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde\": rpc error: code = NotFound desc = could not find container \"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde\": container with ID starting with a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde not found: ID does not exist" Mar 20 17:42:10 crc kubenswrapper[4795]: I0320 17:42:10.703354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerStarted","Data":"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561"} Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.274370 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" path="/var/lib/kubelet/pods/ac35d627-20df-4aad-9779-e154f9cb617a/volumes" Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.301074 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.301173 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.301239 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.303001 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.303137 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1" gracePeriod=600 Mar 20 17:42:12 crc kubenswrapper[4795]: I0320 17:42:12.734382 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1" exitCode=0 Mar 20 17:42:12 crc kubenswrapper[4795]: I0320 17:42:12.734447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1"} Mar 20 17:42:12 crc kubenswrapper[4795]: I0320 17:42:12.735050 4795 scope.go:117] "RemoveContainer" containerID="98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd" Mar 20 17:42:12 crc kubenswrapper[4795]: I0320 17:42:12.739634 4795 generic.go:334] "Generic (PLEG): container finished" podID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerID="143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561" exitCode=0 Mar 20 17:42:12 crc kubenswrapper[4795]: I0320 17:42:12.739715 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerDied","Data":"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561"} Mar 20 17:42:13 crc kubenswrapper[4795]: I0320 17:42:13.758751 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerStarted","Data":"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a"} Mar 20 17:42:13 crc kubenswrapper[4795]: I0320 17:42:13.762337 4795 generic.go:334] "Generic (PLEG): container finished" podID="30c1ffc4-752a-4b0a-a95b-2bfbc458dc53" containerID="91b2765becbe485413b561b8cb2a1ab6831f2a2f0328f3ad53837ee41431baef" exitCode=0 Mar 20 17:42:13 crc kubenswrapper[4795]: I0320 17:42:13.762429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53","Type":"ContainerDied","Data":"91b2765becbe485413b561b8cb2a1ab6831f2a2f0328f3ad53837ee41431baef"} Mar 20 17:42:13 crc kubenswrapper[4795]: I0320 17:42:13.772085 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"} Mar 20 17:42:13 crc kubenswrapper[4795]: I0320 17:42:13.803808 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lcjkp" podStartSLOduration=3.062137191 podStartE2EDuration="7.803787006s" podCreationTimestamp="2026-03-20 17:42:06 +0000 UTC" firstStartedPulling="2026-03-20 17:42:08.636319115 +0000 UTC m=+1472.094350666" lastFinishedPulling="2026-03-20 17:42:13.37796893 +0000 UTC m=+1476.836000481" observedRunningTime="2026-03-20 17:42:13.787839569 +0000 UTC 
m=+1477.245871150" watchObservedRunningTime="2026-03-20 17:42:13.803787006 +0000 UTC m=+1477.261818547" Mar 20 17:42:14 crc kubenswrapper[4795]: I0320 17:42:14.782180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53","Type":"ContainerStarted","Data":"4866f5f245a841d3b385d7f1801c46ad49f8591588f65bbe7c1664b1fe275b87"} Mar 20 17:42:14 crc kubenswrapper[4795]: I0320 17:42:14.783030 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:42:14 crc kubenswrapper[4795]: I0320 17:42:14.819845 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.819818875 podStartE2EDuration="35.819818875s" podCreationTimestamp="2026-03-20 17:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:42:14.806804908 +0000 UTC m=+1478.264836489" watchObservedRunningTime="2026-03-20 17:42:14.819818875 +0000 UTC m=+1478.277850446" Mar 20 17:42:17 crc kubenswrapper[4795]: I0320 17:42:17.049383 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:17 crc kubenswrapper[4795]: I0320 17:42:17.050054 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:17 crc kubenswrapper[4795]: I0320 17:42:17.815304 4795 generic.go:334] "Generic (PLEG): container finished" podID="ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc" containerID="6c9bb98a5b27dfaab93f9b0bf86e2dc36843d779c16099e2338e7ce5f1541db7" exitCode=0 Mar 20 17:42:17 crc kubenswrapper[4795]: I0320 17:42:17.815386 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc","Type":"ContainerDied","Data":"6c9bb98a5b27dfaab93f9b0bf86e2dc36843d779c16099e2338e7ce5f1541db7"} Mar 20 17:42:18 crc kubenswrapper[4795]: I0320 17:42:18.128318 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcjkp" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" probeResult="failure" output=< Mar 20 17:42:18 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:42:18 crc kubenswrapper[4795]: > Mar 20 17:42:18 crc kubenswrapper[4795]: I0320 17:42:18.825036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc","Type":"ContainerStarted","Data":"20aa4d5b499482117c4d3947f9429d441904750ad7290a14f19d2ebe60c52bab"} Mar 20 17:42:18 crc kubenswrapper[4795]: I0320 17:42:18.826311 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 17:42:18 crc kubenswrapper[4795]: I0320 17:42:18.858913 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.858891258 podStartE2EDuration="35.858891258s" podCreationTimestamp="2026-03-20 17:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:42:18.856181676 +0000 UTC m=+1482.314213237" watchObservedRunningTime="2026-03-20 17:42:18.858891258 +0000 UTC m=+1482.316922799" Mar 20 17:42:28 crc kubenswrapper[4795]: I0320 17:42:28.117232 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcjkp" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" probeResult="failure" output=< Mar 20 17:42:28 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:42:28 crc 
kubenswrapper[4795]: > Mar 20 17:42:29 crc kubenswrapper[4795]: I0320 17:42:29.603015 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.076598 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk"] Mar 20 17:42:30 crc kubenswrapper[4795]: E0320 17:42:30.077089 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="registry-server" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077107 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="registry-server" Mar 20 17:42:30 crc kubenswrapper[4795]: E0320 17:42:30.077128 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="dnsmasq-dns" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077135 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="dnsmasq-dns" Mar 20 17:42:30 crc kubenswrapper[4795]: E0320 17:42:30.077150 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="init" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077160 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="init" Mar 20 17:42:30 crc kubenswrapper[4795]: E0320 17:42:30.077169 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="extract-utilities" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077176 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="extract-utilities" Mar 20 17:42:30 crc kubenswrapper[4795]: E0320 
17:42:30.077204 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="extract-content" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077211 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="extract-content" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077422 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="dnsmasq-dns" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077441 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="registry-server" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.078104 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.080191 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.080239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.080498 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.080681 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.107423 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk"] Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.241326 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.241382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r44j2\" (UniqueName: \"kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.241504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.241541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.343209 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.343257 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r44j2\" (UniqueName: \"kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.343338 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.343370 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.349096 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.349740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.351034 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.359855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r44j2\" (UniqueName: \"kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.417405 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.946394 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk"] Mar 20 17:42:30 crc kubenswrapper[4795]: W0320 17:42:30.958830 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7bbeb5c_0f49_4fb3_b0b4_57c9bf91977e.slice/crio-5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9 WatchSource:0}: Error finding container 5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9: Status 404 returned error can't find the container with id 5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9 Mar 20 17:42:31 crc kubenswrapper[4795]: I0320 17:42:31.938446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" event={"ID":"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e","Type":"ContainerStarted","Data":"5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9"} Mar 20 17:42:33 crc kubenswrapper[4795]: I0320 17:42:33.651019 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 17:42:38 crc kubenswrapper[4795]: I0320 17:42:38.096972 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcjkp" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" probeResult="failure" output=< Mar 20 17:42:38 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:42:38 crc kubenswrapper[4795]: > Mar 20 17:42:40 crc kubenswrapper[4795]: I0320 17:42:40.894883 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:42:42 crc kubenswrapper[4795]: I0320 17:42:42.029870 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" event={"ID":"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e","Type":"ContainerStarted","Data":"46997074f9c4b9e4f4b38226f8f5a757e2619d320bdcf5429c1321ed40595a79"} Mar 20 17:42:42 crc kubenswrapper[4795]: I0320 17:42:42.057958 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" podStartSLOduration=2.125776813 podStartE2EDuration="12.057940422s" podCreationTimestamp="2026-03-20 17:42:30 +0000 UTC" firstStartedPulling="2026-03-20 17:42:30.960493025 +0000 UTC m=+1494.418524566" lastFinishedPulling="2026-03-20 17:42:40.892656624 +0000 UTC m=+1504.350688175" observedRunningTime="2026-03-20 17:42:42.052473277 +0000 UTC m=+1505.510504818" watchObservedRunningTime="2026-03-20 17:42:42.057940422 +0000 UTC m=+1505.515971963" Mar 20 17:42:47 crc kubenswrapper[4795]: I0320 17:42:47.108499 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:47 crc kubenswrapper[4795]: I0320 17:42:47.187436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:47 crc kubenswrapper[4795]: I0320 17:42:47.382635 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"] Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.097827 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lcjkp" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" containerID="cri-o://d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a" gracePeriod=2 Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.685550 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.761758 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjss4\" (UniqueName: \"kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4\") pod \"196de415-75ca-4b43-bb26-0a9a5a993b1e\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.761853 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities\") pod \"196de415-75ca-4b43-bb26-0a9a5a993b1e\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.762061 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content\") pod \"196de415-75ca-4b43-bb26-0a9a5a993b1e\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.762675 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities" (OuterVolumeSpecName: "utilities") pod "196de415-75ca-4b43-bb26-0a9a5a993b1e" (UID: "196de415-75ca-4b43-bb26-0a9a5a993b1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.768980 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4" (OuterVolumeSpecName: "kube-api-access-rjss4") pod "196de415-75ca-4b43-bb26-0a9a5a993b1e" (UID: "196de415-75ca-4b43-bb26-0a9a5a993b1e"). InnerVolumeSpecName "kube-api-access-rjss4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.801893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gtd2l"] Mar 20 17:42:49 crc kubenswrapper[4795]: E0320 17:42:49.802320 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.802338 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" Mar 20 17:42:49 crc kubenswrapper[4795]: E0320 17:42:49.802355 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="extract-content" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.802365 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="extract-content" Mar 20 17:42:49 crc kubenswrapper[4795]: E0320 17:42:49.802407 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="extract-utilities" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.802417 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="extract-utilities" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.802626 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.805362 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.823551 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gtd2l"] Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.864481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-utilities\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.864573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-catalog-content\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.864647 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78b5w\" (UniqueName: \"kubernetes.io/projected/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-kube-api-access-78b5w\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.864800 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjss4\" (UniqueName: \"kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.864835 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.912139 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "196de415-75ca-4b43-bb26-0a9a5a993b1e" (UID: "196de415-75ca-4b43-bb26-0a9a5a993b1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.966644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-utilities\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.966945 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-catalog-content\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.967085 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78b5w\" (UniqueName: \"kubernetes.io/projected/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-kube-api-access-78b5w\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.967293 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.967565 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-utilities\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.967587 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-catalog-content\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.985184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78b5w\" (UniqueName: \"kubernetes.io/projected/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-kube-api-access-78b5w\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.108049 4795 generic.go:334] "Generic (PLEG): container finished" podID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerID="d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a" exitCode=0 Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.108096 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerDied","Data":"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a"} Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.108126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" 
event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerDied","Data":"fabd95db1d654ea27cdf8aa4144ea02558f6e4e1468eabf7d4dab5f79064ffa1"} Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.108148 4795 scope.go:117] "RemoveContainer" containerID="d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.108282 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.132855 4795 scope.go:117] "RemoveContainer" containerID="143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.156680 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"] Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.157232 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.165412 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"] Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.206967 4795 scope.go:117] "RemoveContainer" containerID="d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.276922 4795 scope.go:117] "RemoveContainer" containerID="d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a" Mar 20 17:42:50 crc kubenswrapper[4795]: E0320 17:42:50.277705 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a\": container with ID starting with d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a not found: ID does not exist" 
containerID="d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.277733 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a"} err="failed to get container status \"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a\": rpc error: code = NotFound desc = could not find container \"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a\": container with ID starting with d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a not found: ID does not exist" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.277752 4795 scope.go:117] "RemoveContainer" containerID="143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561" Mar 20 17:42:50 crc kubenswrapper[4795]: E0320 17:42:50.278133 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561\": container with ID starting with 143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561 not found: ID does not exist" containerID="143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.278178 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561"} err="failed to get container status \"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561\": rpc error: code = NotFound desc = could not find container \"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561\": container with ID starting with 143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561 not found: ID does not exist" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.278208 4795 scope.go:117] 
"RemoveContainer" containerID="d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa" Mar 20 17:42:50 crc kubenswrapper[4795]: E0320 17:42:50.278458 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa\": container with ID starting with d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa not found: ID does not exist" containerID="d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.278480 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa"} err="failed to get container status \"d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa\": rpc error: code = NotFound desc = could not find container \"d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa\": container with ID starting with d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa not found: ID does not exist" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.706826 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gtd2l"] Mar 20 17:42:51 crc kubenswrapper[4795]: I0320 17:42:51.127018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtd2l" event={"ID":"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0","Type":"ContainerStarted","Data":"d7167042db96168386712bff371b2d9d2d5243adbff6d1143fb5d96c2d30f752"} Mar 20 17:42:51 crc kubenswrapper[4795]: I0320 17:42:51.127113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtd2l" event={"ID":"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0","Type":"ContainerStarted","Data":"1e3d77ed281152e7161201d4f3863784c405b8d5680af3ee5b4cdedffa226744"} Mar 20 17:42:51 crc 
kubenswrapper[4795]: I0320 17:42:51.228775 4795 scope.go:117] "RemoveContainer" containerID="a9b37f38da5a02a709d41d5d8718cdfaffaae9f225a892b8a803fd7f9d1c5b9d" Mar 20 17:42:51 crc kubenswrapper[4795]: I0320 17:42:51.262404 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" path="/var/lib/kubelet/pods/196de415-75ca-4b43-bb26-0a9a5a993b1e/volumes" Mar 20 17:42:52 crc kubenswrapper[4795]: I0320 17:42:52.144789 4795 generic.go:334] "Generic (PLEG): container finished" podID="c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0" containerID="d7167042db96168386712bff371b2d9d2d5243adbff6d1143fb5d96c2d30f752" exitCode=0 Mar 20 17:42:52 crc kubenswrapper[4795]: I0320 17:42:52.144873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtd2l" event={"ID":"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0","Type":"ContainerDied","Data":"d7167042db96168386712bff371b2d9d2d5243adbff6d1143fb5d96c2d30f752"} Mar 20 17:42:55 crc kubenswrapper[4795]: I0320 17:42:55.172321 4795 generic.go:334] "Generic (PLEG): container finished" podID="e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" containerID="46997074f9c4b9e4f4b38226f8f5a757e2619d320bdcf5429c1321ed40595a79" exitCode=0 Mar 20 17:42:55 crc kubenswrapper[4795]: I0320 17:42:55.172363 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" event={"ID":"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e","Type":"ContainerDied","Data":"46997074f9c4b9e4f4b38226f8f5a757e2619d320bdcf5429c1321ed40595a79"} Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.396174 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.563040 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam\") pod \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.563160 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle\") pod \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.563270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r44j2\" (UniqueName: \"kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2\") pod \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.563362 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory\") pod \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.572925 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" (UID: "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.573384 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2" (OuterVolumeSpecName: "kube-api-access-r44j2") pod "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" (UID: "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e"). InnerVolumeSpecName "kube-api-access-r44j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.606542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory" (OuterVolumeSpecName: "inventory") pod "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" (UID: "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.627723 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" (UID: "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.668221 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.668252 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.668263 4795 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.668272 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r44j2\" (UniqueName: \"kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.203121 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.203134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" event={"ID":"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e","Type":"ContainerDied","Data":"5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9"} Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.203290 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.207640 4795 generic.go:334] "Generic (PLEG): container finished" podID="c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0" containerID="eb283ca4787e407dd208cf425b8606a032f09859fc48d162cab700175d647056" exitCode=0 Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.207716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtd2l" event={"ID":"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0","Type":"ContainerDied","Data":"eb283ca4787e407dd208cf425b8606a032f09859fc48d162cab700175d647056"} Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.510716 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9"] Mar 20 17:42:58 crc kubenswrapper[4795]: E0320 17:42:58.511280 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.511304 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.511626 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.512515 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.515228 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.515255 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.516188 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.518474 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.532043 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9"] Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.687469 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjvtq\" (UniqueName: \"kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.688795 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" 
(UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.689095 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.791254 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.791414 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjvtq\" (UniqueName: \"kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.791564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.800014 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.807734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.816504 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjvtq\" (UniqueName: \"kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.831475 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:59 crc kubenswrapper[4795]: I0320 17:42:59.231121 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9"] Mar 20 17:43:00 crc kubenswrapper[4795]: I0320 17:43:00.229675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtd2l" event={"ID":"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0","Type":"ContainerStarted","Data":"f6ada8e1c8713492111085ea654640396f7aadd1a41c917769fc30db45524b2e"} Mar 20 17:43:00 crc kubenswrapper[4795]: I0320 17:43:00.231198 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" event={"ID":"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936","Type":"ContainerStarted","Data":"70ece47a35ac9935a28b0de5740030bafff6f54285749e61a4fad5cb6bada8f0"} Mar 20 17:43:00 crc kubenswrapper[4795]: I0320 17:43:00.258914 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gtd2l" podStartSLOduration=4.450067076 podStartE2EDuration="11.258891691s" podCreationTimestamp="2026-03-20 17:42:49 +0000 UTC" firstStartedPulling="2026-03-20 17:42:52.146793331 +0000 UTC m=+1515.604824912" lastFinishedPulling="2026-03-20 17:42:58.955617976 +0000 UTC m=+1522.413649527" observedRunningTime="2026-03-20 17:43:00.250103306 +0000 UTC m=+1523.708134867" watchObservedRunningTime="2026-03-20 17:43:00.258891691 +0000 UTC m=+1523.716923232" Mar 20 17:43:01 crc kubenswrapper[4795]: I0320 17:43:01.241000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" event={"ID":"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936","Type":"ContainerStarted","Data":"36715ba352620b7792388f384fb4873d45c3ad7407a718da6d06c5542fddbcf2"} Mar 20 17:43:01 crc kubenswrapper[4795]: I0320 17:43:01.260005 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" podStartSLOduration=2.376708208 podStartE2EDuration="3.259987033s" podCreationTimestamp="2026-03-20 17:42:58 +0000 UTC" firstStartedPulling="2026-03-20 17:42:59.238441795 +0000 UTC m=+1522.696473346" lastFinishedPulling="2026-03-20 17:43:00.12172063 +0000 UTC m=+1523.579752171" observedRunningTime="2026-03-20 17:43:01.256777307 +0000 UTC m=+1524.714808848" watchObservedRunningTime="2026-03-20 17:43:01.259987033 +0000 UTC m=+1524.718018574" Mar 20 17:43:03 crc kubenswrapper[4795]: I0320 17:43:03.266091 4795 generic.go:334] "Generic (PLEG): container finished" podID="d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" containerID="36715ba352620b7792388f384fb4873d45c3ad7407a718da6d06c5542fddbcf2" exitCode=0 Mar 20 17:43:03 crc kubenswrapper[4795]: I0320 17:43:03.267056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" event={"ID":"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936","Type":"ContainerDied","Data":"36715ba352620b7792388f384fb4873d45c3ad7407a718da6d06c5542fddbcf2"} Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.713616 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.913668 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory\") pod \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.913779 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjvtq\" (UniqueName: \"kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq\") pod \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.913889 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam\") pod \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.918321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq" (OuterVolumeSpecName: "kube-api-access-pjvtq") pod "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" (UID: "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936"). InnerVolumeSpecName "kube-api-access-pjvtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.943140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" (UID: "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.952805 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory" (OuterVolumeSpecName: "inventory") pod "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" (UID: "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.016654 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.016734 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjvtq\" (UniqueName: \"kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq\") on node \"crc\" DevicePath \"\"" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.016749 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.291336 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" 
event={"ID":"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936","Type":"ContainerDied","Data":"70ece47a35ac9935a28b0de5740030bafff6f54285749e61a4fad5cb6bada8f0"} Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.291709 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70ece47a35ac9935a28b0de5740030bafff6f54285749e61a4fad5cb6bada8f0" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.291411 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.374780 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"] Mar 20 17:43:05 crc kubenswrapper[4795]: E0320 17:43:05.375159 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.375177 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.375361 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.376060 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.382047 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.382453 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.382666 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.390053 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.427106 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"] Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.447904 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.447969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.448068 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.448445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzxfh\" (UniqueName: \"kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.553118 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzxfh\" (UniqueName: \"kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.553193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.553216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.553245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.557758 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.557969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.559188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.572136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzxfh\" (UniqueName: \"kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.705838 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"
Mar 20 17:43:06 crc kubenswrapper[4795]: I0320 17:43:06.289854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"]
Mar 20 17:43:07 crc kubenswrapper[4795]: I0320 17:43:07.316061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" event={"ID":"0708214e-e711-465a-a54e-97a462b2777e","Type":"ContainerStarted","Data":"8048e17374d65a5593e7f3026aacfa891127041d59875217add93959662e9cdc"}
Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.158717 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gtd2l"
Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.159304 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gtd2l"
Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.245619 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gtd2l"
Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.388025 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gtd2l"
Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.568921 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gtd2l"]
Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.625363 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tw8kt"]
Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.625641 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tw8kt" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="registry-server" containerID="cri-o://e51f3706cb85710070577b764255e613eb4bda5a66f7cf44046e8dca83ade02b" gracePeriod=2
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.358394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" event={"ID":"0708214e-e711-465a-a54e-97a462b2777e","Type":"ContainerStarted","Data":"c179fa5fa1c857ee73ee0d25264e475b241069dffa52757206eb76081b38cae9"}
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.363862 4795 generic.go:334] "Generic (PLEG): container finished" podID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerID="e51f3706cb85710070577b764255e613eb4bda5a66f7cf44046e8dca83ade02b" exitCode=0
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.364870 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerDied","Data":"e51f3706cb85710070577b764255e613eb4bda5a66f7cf44046e8dca83ade02b"}
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.378090 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" podStartSLOduration=4.301181573 podStartE2EDuration="6.378076604s" podCreationTimestamp="2026-03-20 17:43:05 +0000 UTC" firstStartedPulling="2026-03-20 17:43:06.304700582 +0000 UTC m=+1529.762732113" lastFinishedPulling="2026-03-20 17:43:08.381595603 +0000 UTC m=+1531.839627144" observedRunningTime="2026-03-20 17:43:11.3742587 +0000 UTC m=+1534.832290251" watchObservedRunningTime="2026-03-20 17:43:11.378076604 +0000 UTC m=+1534.836108145"
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.865950 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tw8kt"
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.885377 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content\") pod \"0cba71d7-62e8-4541-9728-23dd5ff4b982\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") "
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.885446 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8j2j\" (UniqueName: \"kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j\") pod \"0cba71d7-62e8-4541-9728-23dd5ff4b982\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") "
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.885533 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities\") pod \"0cba71d7-62e8-4541-9728-23dd5ff4b982\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") "
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.886542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities" (OuterVolumeSpecName: "utilities") pod "0cba71d7-62e8-4541-9728-23dd5ff4b982" (UID: "0cba71d7-62e8-4541-9728-23dd5ff4b982"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.897508 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j" (OuterVolumeSpecName: "kube-api-access-m8j2j") pod "0cba71d7-62e8-4541-9728-23dd5ff4b982" (UID: "0cba71d7-62e8-4541-9728-23dd5ff4b982"). InnerVolumeSpecName "kube-api-access-m8j2j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.943869 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cba71d7-62e8-4541-9728-23dd5ff4b982" (UID: "0cba71d7-62e8-4541-9728-23dd5ff4b982"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.988239 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.988270 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8j2j\" (UniqueName: \"kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.988280 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.377763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerDied","Data":"dc6b8994280ee1e1b27cf8f2c886374ff8bd967d27295a99223b5cd05c51c5e5"}
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.377838 4795 scope.go:117] "RemoveContainer" containerID="e51f3706cb85710070577b764255e613eb4bda5a66f7cf44046e8dca83ade02b"
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.377882 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tw8kt"
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.410166 4795 scope.go:117] "RemoveContainer" containerID="8963fe7721a09d9f6c228e790432497f6b1fff70d60afc4485e7fcd92391890f"
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.429826 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tw8kt"]
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.440910 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tw8kt"]
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.457298 4795 scope.go:117] "RemoveContainer" containerID="099eb6fe1b44619943ee789acf319c90001ea00f649ef59a36a0aa98e76bd549"
Mar 20 17:43:13 crc kubenswrapper[4795]: I0320 17:43:13.261099 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" path="/var/lib/kubelet/pods/0cba71d7-62e8-4541-9728-23dd5ff4b982/volumes"
Mar 20 17:43:51 crc kubenswrapper[4795]: I0320 17:43:51.411568 4795 scope.go:117] "RemoveContainer" containerID="b72c1a68c2af640e452031ab226e7a764d4d714ebc4a58907bac640f8e0500bb"
Mar 20 17:43:51 crc kubenswrapper[4795]: I0320 17:43:51.520603 4795 scope.go:117] "RemoveContainer" containerID="9df2c204a93d51c554ceaf159d1f9366b95bed6cc7f2757ae8ae8edae396f498"
Mar 20 17:43:51 crc kubenswrapper[4795]: I0320 17:43:51.552542 4795 scope.go:117] "RemoveContainer" containerID="64da53ed17d6e7c8ed644863f568fc0f6e5e946972ad8fd66ba6db39c157b1e6"
Mar 20 17:43:51 crc kubenswrapper[4795]: I0320 17:43:51.614280 4795 scope.go:117] "RemoveContainer" containerID="788d4e8fa3d02d5dbc18ca88de94a81c7f1ea8b686ca6f20236e85aa9366458c"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.153884 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567144-khl6t"]
Mar 20 17:44:00 crc kubenswrapper[4795]: E0320 17:44:00.155061 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="extract-content"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.155083 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="extract-content"
Mar 20 17:44:00 crc kubenswrapper[4795]: E0320 17:44:00.155111 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.155122 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4795]: E0320 17:44:00.155143 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="extract-utilities"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.155155 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="extract-utilities"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.155475 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.156426 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.159329 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.159635 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.160331 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.168388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-khl6t"]
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.279529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn6bd\" (UniqueName: \"kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd\") pod \"auto-csr-approver-29567144-khl6t\" (UID: \"a6396cd8-bc19-4f24-ae36-12356bfa8133\") " pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.381129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn6bd\" (UniqueName: \"kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd\") pod \"auto-csr-approver-29567144-khl6t\" (UID: \"a6396cd8-bc19-4f24-ae36-12356bfa8133\") " pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.418847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn6bd\" (UniqueName: \"kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd\") pod \"auto-csr-approver-29567144-khl6t\" (UID: \"a6396cd8-bc19-4f24-ae36-12356bfa8133\") " pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.477659 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.944276 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-khl6t"]
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.954735 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 17:44:01 crc kubenswrapper[4795]: I0320 17:44:01.940929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-khl6t" event={"ID":"a6396cd8-bc19-4f24-ae36-12356bfa8133","Type":"ContainerStarted","Data":"24b1cd71bef426c5b45ef08818327c53e150c64c24e186695f96c00266ca4afc"}
Mar 20 17:44:03 crc kubenswrapper[4795]: I0320 17:44:03.960053 4795 generic.go:334] "Generic (PLEG): container finished" podID="a6396cd8-bc19-4f24-ae36-12356bfa8133" containerID="29087c37b0e22594df358a498bb26205f2050bb1e4a607372b3a2ba3b4df8dd7" exitCode=0
Mar 20 17:44:03 crc kubenswrapper[4795]: I0320 17:44:03.960109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-khl6t" event={"ID":"a6396cd8-bc19-4f24-ae36-12356bfa8133","Type":"ContainerDied","Data":"29087c37b0e22594df358a498bb26205f2050bb1e4a607372b3a2ba3b4df8dd7"}
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.282975 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.380548 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn6bd\" (UniqueName: \"kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd\") pod \"a6396cd8-bc19-4f24-ae36-12356bfa8133\" (UID: \"a6396cd8-bc19-4f24-ae36-12356bfa8133\") "
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.387216 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd" (OuterVolumeSpecName: "kube-api-access-hn6bd") pod "a6396cd8-bc19-4f24-ae36-12356bfa8133" (UID: "a6396cd8-bc19-4f24-ae36-12356bfa8133"). InnerVolumeSpecName "kube-api-access-hn6bd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.483294 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn6bd\" (UniqueName: \"kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd\") on node \"crc\" DevicePath \"\""
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.976874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-khl6t" event={"ID":"a6396cd8-bc19-4f24-ae36-12356bfa8133","Type":"ContainerDied","Data":"24b1cd71bef426c5b45ef08818327c53e150c64c24e186695f96c00266ca4afc"}
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.976927 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b1cd71bef426c5b45ef08818327c53e150c64c24e186695f96c00266ca4afc"
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.976995 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:06 crc kubenswrapper[4795]: I0320 17:44:06.353449 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-7flct"]
Mar 20 17:44:06 crc kubenswrapper[4795]: I0320 17:44:06.361719 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-7flct"]
Mar 20 17:44:07 crc kubenswrapper[4795]: I0320 17:44:07.265168 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83d2a1a-2b3b-409a-997a-672e322b1d8e" path="/var/lib/kubelet/pods/e83d2a1a-2b3b-409a-997a-672e322b1d8e/volumes"
Mar 20 17:44:41 crc kubenswrapper[4795]: I0320 17:44:41.300635 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:44:41 crc kubenswrapper[4795]: I0320 17:44:41.301232 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:44:51 crc kubenswrapper[4795]: I0320 17:44:51.827675 4795 scope.go:117] "RemoveContainer" containerID="27cb2cc4ca0cf03af5e4f56a72a8901b4a28c70c5abb54e1f86d55c8053dcc74"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.170835 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"]
Mar 20 17:45:00 crc kubenswrapper[4795]: E0320 17:45:00.172483 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6396cd8-bc19-4f24-ae36-12356bfa8133" containerName="oc"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.172511 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6396cd8-bc19-4f24-ae36-12356bfa8133" containerName="oc"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.172766 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6396cd8-bc19-4f24-ae36-12356bfa8133" containerName="oc"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.173899 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.176735 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.176757 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.191097 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"]
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.311043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llqvx\" (UniqueName: \"kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.311476 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.311534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.413382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llqvx\" (UniqueName: \"kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.413542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.413595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.414810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.420446 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.432797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llqvx\" (UniqueName: \"kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.518414 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:01 crc kubenswrapper[4795]: I0320 17:45:01.016417 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"]
Mar 20 17:45:01 crc kubenswrapper[4795]: I0320 17:45:01.604836 4795 generic.go:334] "Generic (PLEG): container finished" podID="cd60241d-b207-4a9a-86b6-3be32ab282d3" containerID="7e78ac608afa56e8111695b336413ee802aca06929422f0042e8a413df5d1f4a" exitCode=0
Mar 20 17:45:01 crc kubenswrapper[4795]: I0320 17:45:01.604941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb" event={"ID":"cd60241d-b207-4a9a-86b6-3be32ab282d3","Type":"ContainerDied","Data":"7e78ac608afa56e8111695b336413ee802aca06929422f0042e8a413df5d1f4a"}
Mar 20 17:45:01 crc kubenswrapper[4795]: I0320 17:45:01.605157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb" event={"ID":"cd60241d-b207-4a9a-86b6-3be32ab282d3","Type":"ContainerStarted","Data":"81730fe9e4abf54b7e51797503af0b395a54f504426180c9e760735101e8ae5e"}
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.085860 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.269412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume\") pod \"cd60241d-b207-4a9a-86b6-3be32ab282d3\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") "
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.269736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume\") pod \"cd60241d-b207-4a9a-86b6-3be32ab282d3\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") "
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.269783 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llqvx\" (UniqueName: \"kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx\") pod \"cd60241d-b207-4a9a-86b6-3be32ab282d3\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") "
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.270461 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume" (OuterVolumeSpecName: "config-volume") pod "cd60241d-b207-4a9a-86b6-3be32ab282d3" (UID: "cd60241d-b207-4a9a-86b6-3be32ab282d3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.277455 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cd60241d-b207-4a9a-86b6-3be32ab282d3" (UID: "cd60241d-b207-4a9a-86b6-3be32ab282d3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.283816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx" (OuterVolumeSpecName: "kube-api-access-llqvx") pod "cd60241d-b207-4a9a-86b6-3be32ab282d3" (UID: "cd60241d-b207-4a9a-86b6-3be32ab282d3"). InnerVolumeSpecName "kube-api-access-llqvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.371988 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.372126 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.372308 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llqvx\" (UniqueName: \"kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx\") on node \"crc\" DevicePath \"\""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.634046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb" event={"ID":"cd60241d-b207-4a9a-86b6-3be32ab282d3","Type":"ContainerDied","Data":"81730fe9e4abf54b7e51797503af0b395a54f504426180c9e760735101e8ae5e"}
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.634105 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81730fe9e4abf54b7e51797503af0b395a54f504426180c9e760735101e8ae5e"
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.634185 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:11 crc kubenswrapper[4795]: I0320 17:45:11.300206 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:45:11 crc kubenswrapper[4795]: I0320 17:45:11.301083 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:45:41 crc kubenswrapper[4795]: I0320 17:45:41.300883 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:45:41 crc kubenswrapper[4795]: I0320 17:45:41.301638 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:45:41 crc kubenswrapper[4795]: I0320 17:45:41.301755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt"
Mar 20 17:45:41 crc kubenswrapper[4795]: I0320 17:45:41.303064 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 17:45:41 crc kubenswrapper[4795]: I0320 17:45:41.303191 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" gracePeriod=600
Mar 20 17:45:41 crc kubenswrapper[4795]: E0320 17:45:41.445476 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:45:42 crc kubenswrapper[4795]: I0320 17:45:42.050638 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" exitCode=0
Mar 20 17:45:42 crc kubenswrapper[4795]: I0320 17:45:42.050786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"}
Mar 20 17:45:42 crc kubenswrapper[4795]: I0320 17:45:42.050836 4795 scope.go:117] "RemoveContainer" containerID="c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1"
Mar 20 17:45:42 crc kubenswrapper[4795]: I0320 17:45:42.051428 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"
Mar 20 17:45:42 crc kubenswrapper[4795]: E0320 17:45:42.051666 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:45:45 crc kubenswrapper[4795]: I0320 17:45:45.919108 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"]
Mar 20 17:45:45 crc kubenswrapper[4795]: E0320 17:45:45.920218 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd60241d-b207-4a9a-86b6-3be32ab282d3" containerName="collect-profiles"
Mar 20 17:45:45 crc kubenswrapper[4795]: I0320 17:45:45.920232 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd60241d-b207-4a9a-86b6-3be32ab282d3" containerName="collect-profiles"
Mar 20 17:45:45 crc kubenswrapper[4795]: I0320 17:45:45.920430 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd60241d-b207-4a9a-86b6-3be32ab282d3" containerName="collect-profiles"
Mar 20 17:45:45 crc kubenswrapper[4795]: I0320 17:45:45.924535 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:45 crc kubenswrapper[4795]: I0320 17:45:45.945067 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"]
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.075644 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.075838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-292px\" (UniqueName: \"kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.076037 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.178210 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-292px\" (UniqueName: \"kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.178272 4795 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.178304 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.178742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.178775 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.211932 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-292px\" (UniqueName: \"kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.246298 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.727323 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"] Mar 20 17:45:47 crc kubenswrapper[4795]: I0320 17:45:47.097714 4795 generic.go:334] "Generic (PLEG): container finished" podID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerID="9c4ccb984fcea0c869be462be23965db4e7b09dccf48c35e2fcdcc062beca428" exitCode=0 Mar 20 17:45:47 crc kubenswrapper[4795]: I0320 17:45:47.097757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerDied","Data":"9c4ccb984fcea0c869be462be23965db4e7b09dccf48c35e2fcdcc062beca428"} Mar 20 17:45:47 crc kubenswrapper[4795]: I0320 17:45:47.097797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerStarted","Data":"fac724dbd822f9dfabd9959c9d8174e232247e72a2949be62d1ad30b2c3b4ba9"} Mar 20 17:45:49 crc kubenswrapper[4795]: I0320 17:45:49.117876 4795 generic.go:334] "Generic (PLEG): container finished" podID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerID="ffd6f57295f980f22dcb8cb678472859d50f02ddbc01cd1b7e0bbaa6f1d24f8d" exitCode=0 Mar 20 17:45:49 crc kubenswrapper[4795]: I0320 17:45:49.117976 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerDied","Data":"ffd6f57295f980f22dcb8cb678472859d50f02ddbc01cd1b7e0bbaa6f1d24f8d"} Mar 20 17:45:50 crc kubenswrapper[4795]: I0320 17:45:50.130602 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" 
event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerStarted","Data":"57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e"} Mar 20 17:45:55 crc kubenswrapper[4795]: I0320 17:45:55.255213 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:45:55 crc kubenswrapper[4795]: E0320 17:45:55.256321 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:45:56 crc kubenswrapper[4795]: I0320 17:45:56.247384 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:56 crc kubenswrapper[4795]: I0320 17:45:56.247466 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:56 crc kubenswrapper[4795]: I0320 17:45:56.310766 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:56 crc kubenswrapper[4795]: I0320 17:45:56.335457 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lqwpr" podStartSLOduration=8.708193489 podStartE2EDuration="11.335434024s" podCreationTimestamp="2026-03-20 17:45:45 +0000 UTC" firstStartedPulling="2026-03-20 17:45:47.100707401 +0000 UTC m=+1690.558738942" lastFinishedPulling="2026-03-20 17:45:49.727947926 +0000 UTC m=+1693.185979477" observedRunningTime="2026-03-20 17:45:50.157599398 +0000 UTC m=+1693.615630939" watchObservedRunningTime="2026-03-20 17:45:56.335434024 
+0000 UTC m=+1699.793465575" Mar 20 17:45:57 crc kubenswrapper[4795]: I0320 17:45:57.265972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:57 crc kubenswrapper[4795]: I0320 17:45:57.332887 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"] Mar 20 17:45:59 crc kubenswrapper[4795]: I0320 17:45:59.226340 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lqwpr" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="registry-server" containerID="cri-o://57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e" gracePeriod=2 Mar 20 17:45:59 crc kubenswrapper[4795]: E0320 17:45:59.799448 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf700090f_cd79_4419_aeb1_7cf66ba3fcf5.slice/crio-conmon-57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.161221 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567146-xzfkq"] Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.166158 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.173230 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.173414 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.173597 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.178973 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-xzfkq"] Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.238165 4795 generic.go:334] "Generic (PLEG): container finished" podID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerID="57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e" exitCode=0 Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.238214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerDied","Data":"57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e"} Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.343837 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s57g\" (UniqueName: \"kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g\") pod \"auto-csr-approver-29567146-xzfkq\" (UID: \"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c\") " pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.440230 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.445321 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s57g\" (UniqueName: \"kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g\") pod \"auto-csr-approver-29567146-xzfkq\" (UID: \"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c\") " pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.469102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s57g\" (UniqueName: \"kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g\") pod \"auto-csr-approver-29567146-xzfkq\" (UID: \"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c\") " pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.500800 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.546410 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content\") pod \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.546586 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-292px\" (UniqueName: \"kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px\") pod \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.546715 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities\") pod \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.547519 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities" (OuterVolumeSpecName: "utilities") pod "f700090f-cd79-4419-aeb1-7cf66ba3fcf5" (UID: "f700090f-cd79-4419-aeb1-7cf66ba3fcf5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.549440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px" (OuterVolumeSpecName: "kube-api-access-292px") pod "f700090f-cd79-4419-aeb1-7cf66ba3fcf5" (UID: "f700090f-cd79-4419-aeb1-7cf66ba3fcf5"). InnerVolumeSpecName "kube-api-access-292px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.586833 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f700090f-cd79-4419-aeb1-7cf66ba3fcf5" (UID: "f700090f-cd79-4419-aeb1-7cf66ba3fcf5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.648882 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.648912 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.648923 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-292px\" (UniqueName: \"kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:00 crc kubenswrapper[4795]: W0320 17:46:00.979794 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ce1ddf5_f6e1_40ab_926d_4cf03d502e9c.slice/crio-3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372 WatchSource:0}: Error finding container 3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372: Status 404 returned error can't find the container with id 3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372 Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.980182 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29567146-xzfkq"] Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.280362 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.293599 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" event={"ID":"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c","Type":"ContainerStarted","Data":"3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372"} Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.293662 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerDied","Data":"fac724dbd822f9dfabd9959c9d8174e232247e72a2949be62d1ad30b2c3b4ba9"} Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.293750 4795 scope.go:117] "RemoveContainer" containerID="57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e" Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.339608 4795 scope.go:117] "RemoveContainer" containerID="ffd6f57295f980f22dcb8cb678472859d50f02ddbc01cd1b7e0bbaa6f1d24f8d" Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.342794 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"] Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.366095 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"] Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.382577 4795 scope.go:117] "RemoveContainer" containerID="9c4ccb984fcea0c869be462be23965db4e7b09dccf48c35e2fcdcc062beca428" Mar 20 17:46:02 crc kubenswrapper[4795]: I0320 17:46:02.294975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" 
event={"ID":"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c","Type":"ContainerStarted","Data":"19365e3a16d7780a49439b00d5a850dc06a2d65e28a412bb9b05e779d9d4ec51"} Mar 20 17:46:02 crc kubenswrapper[4795]: I0320 17:46:02.317539 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" podStartSLOduration=1.4386096959999999 podStartE2EDuration="2.31751409s" podCreationTimestamp="2026-03-20 17:46:00 +0000 UTC" firstStartedPulling="2026-03-20 17:46:00.983049483 +0000 UTC m=+1704.441081034" lastFinishedPulling="2026-03-20 17:46:01.861953887 +0000 UTC m=+1705.319985428" observedRunningTime="2026-03-20 17:46:02.308763486 +0000 UTC m=+1705.766795077" watchObservedRunningTime="2026-03-20 17:46:02.31751409 +0000 UTC m=+1705.775545631" Mar 20 17:46:03 crc kubenswrapper[4795]: I0320 17:46:03.270664 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" path="/var/lib/kubelet/pods/f700090f-cd79-4419-aeb1-7cf66ba3fcf5/volumes" Mar 20 17:46:03 crc kubenswrapper[4795]: I0320 17:46:03.307197 4795 generic.go:334] "Generic (PLEG): container finished" podID="6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" containerID="19365e3a16d7780a49439b00d5a850dc06a2d65e28a412bb9b05e779d9d4ec51" exitCode=0 Mar 20 17:46:03 crc kubenswrapper[4795]: I0320 17:46:03.307242 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" event={"ID":"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c","Type":"ContainerDied","Data":"19365e3a16d7780a49439b00d5a850dc06a2d65e28a412bb9b05e779d9d4ec51"} Mar 20 17:46:04 crc kubenswrapper[4795]: I0320 17:46:04.790721 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:04 crc kubenswrapper[4795]: I0320 17:46:04.963673 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s57g\" (UniqueName: \"kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g\") pod \"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c\" (UID: \"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c\") " Mar 20 17:46:04 crc kubenswrapper[4795]: I0320 17:46:04.968310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g" (OuterVolumeSpecName: "kube-api-access-5s57g") pod "6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" (UID: "6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c"). InnerVolumeSpecName "kube-api-access-5s57g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.065458 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s57g\" (UniqueName: \"kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.328854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" event={"ID":"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c","Type":"ContainerDied","Data":"3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372"} Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.328896 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372" Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.328952 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.398822 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-s5wtb"] Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.409146 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-s5wtb"] Mar 20 17:46:07 crc kubenswrapper[4795]: I0320 17:46:07.270866 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e6fe9e-d22e-420c-b050-a00a53749f1f" path="/var/lib/kubelet/pods/f9e6fe9e-d22e-420c-b050-a00a53749f1f/volumes" Mar 20 17:46:10 crc kubenswrapper[4795]: I0320 17:46:10.252541 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:46:10 crc kubenswrapper[4795]: E0320 17:46:10.253317 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:46:22 crc kubenswrapper[4795]: I0320 17:46:22.252559 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:46:22 crc kubenswrapper[4795]: E0320 17:46:22.253748 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:46:32 crc kubenswrapper[4795]: I0320 17:46:32.637603 4795 generic.go:334] "Generic (PLEG): container finished" podID="0708214e-e711-465a-a54e-97a462b2777e" containerID="c179fa5fa1c857ee73ee0d25264e475b241069dffa52757206eb76081b38cae9" exitCode=0 Mar 20 17:46:32 crc kubenswrapper[4795]: I0320 17:46:32.637741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" event={"ID":"0708214e-e711-465a-a54e-97a462b2777e","Type":"ContainerDied","Data":"c179fa5fa1c857ee73ee0d25264e475b241069dffa52757206eb76081b38cae9"} Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.212809 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.252357 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.252581 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.264605 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle\") pod \"0708214e-e711-465a-a54e-97a462b2777e\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.264837 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kzxfh\" (UniqueName: \"kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh\") pod \"0708214e-e711-465a-a54e-97a462b2777e\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.264881 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory\") pod \"0708214e-e711-465a-a54e-97a462b2777e\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.264914 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam\") pod \"0708214e-e711-465a-a54e-97a462b2777e\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.269662 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh" (OuterVolumeSpecName: "kube-api-access-kzxfh") pod "0708214e-e711-465a-a54e-97a462b2777e" (UID: "0708214e-e711-465a-a54e-97a462b2777e"). InnerVolumeSpecName "kube-api-access-kzxfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.270906 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0708214e-e711-465a-a54e-97a462b2777e" (UID: "0708214e-e711-465a-a54e-97a462b2777e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.291465 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory" (OuterVolumeSpecName: "inventory") pod "0708214e-e711-465a-a54e-97a462b2777e" (UID: "0708214e-e711-465a-a54e-97a462b2777e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.320267 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0708214e-e711-465a-a54e-97a462b2777e" (UID: "0708214e-e711-465a-a54e-97a462b2777e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.372111 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzxfh\" (UniqueName: \"kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.372169 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.372197 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.372225 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.666593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" event={"ID":"0708214e-e711-465a-a54e-97a462b2777e","Type":"ContainerDied","Data":"8048e17374d65a5593e7f3026aacfa891127041d59875217add93959662e9cdc"} Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.666654 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8048e17374d65a5593e7f3026aacfa891127041d59875217add93959662e9cdc" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.667030 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.787838 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k"] Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.788274 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="extract-content" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788296 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="extract-content" Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.788315 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0708214e-e711-465a-a54e-97a462b2777e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788324 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0708214e-e711-465a-a54e-97a462b2777e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.788355 
4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="registry-server" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788364 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="registry-server" Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.788384 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" containerName="oc" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788394 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" containerName="oc" Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.788409 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="extract-utilities" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788417 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="extract-utilities" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788634 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0708214e-e711-465a-a54e-97a462b2777e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788657 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="registry-server" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788674 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" containerName="oc" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.789395 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.791598 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.792134 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.792704 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.793473 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.797636 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k"] Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.889088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.889310 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 
17:46:34.889363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6mp\" (UniqueName: \"kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.991255 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.991650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.991787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw6mp\" (UniqueName: \"kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.998308 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.998492 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:35 crc kubenswrapper[4795]: I0320 17:46:35.025044 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw6mp\" (UniqueName: \"kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:35 crc kubenswrapper[4795]: I0320 17:46:35.122904 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:35 crc kubenswrapper[4795]: I0320 17:46:35.719383 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k"] Mar 20 17:46:36 crc kubenswrapper[4795]: I0320 17:46:36.690022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" event={"ID":"b0af5324-4ba3-4a12-9fdb-b467918ba19d","Type":"ContainerStarted","Data":"332d037bd56e3c7a734160d494102af123e05d9ad0b58769a38e86095019dcc7"} Mar 20 17:46:36 crc kubenswrapper[4795]: I0320 17:46:36.690350 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" event={"ID":"b0af5324-4ba3-4a12-9fdb-b467918ba19d","Type":"ContainerStarted","Data":"6c07a2c08823474a88320c9dd611604352c7622a34255df23ca241659b07db6b"} Mar 20 17:46:36 crc kubenswrapper[4795]: I0320 17:46:36.716903 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" podStartSLOduration=2.192368864 podStartE2EDuration="2.716879227s" podCreationTimestamp="2026-03-20 17:46:34 +0000 UTC" firstStartedPulling="2026-03-20 17:46:35.714667032 +0000 UTC m=+1739.172698583" lastFinishedPulling="2026-03-20 17:46:36.239177395 +0000 UTC m=+1739.697208946" observedRunningTime="2026-03-20 17:46:36.70865215 +0000 UTC m=+1740.166683701" watchObservedRunningTime="2026-03-20 17:46:36.716879227 +0000 UTC m=+1740.174910778" Mar 20 17:46:47 crc kubenswrapper[4795]: I0320 17:46:47.266220 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:46:47 crc kubenswrapper[4795]: E0320 17:46:47.267201 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:46:51 crc kubenswrapper[4795]: I0320 17:46:51.966387 4795 scope.go:117] "RemoveContainer" containerID="8ac90f4263985d5a19d6f00ac01d70eb81681c5a298c4e2c5302052e573286a6" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.065248 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-c5rg6"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.084829 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ncfp9"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.097854 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-30ae-account-create-update-d79gp"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.108595 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a409-account-create-update-zvscf"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.117573 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8d96q"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.127567 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-30ae-account-create-update-d79gp"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.136316 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-c5rg6"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.145963 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ncfp9"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.153763 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8d96q"] Mar 20 17:46:57 crc 
kubenswrapper[4795]: I0320 17:46:57.161487 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a409-account-create-update-zvscf"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.169671 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1681-account-create-update-vpwb2"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.180745 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1681-account-create-update-vpwb2"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.266405 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" path="/var/lib/kubelet/pods/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87/volumes" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.267402 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389c1f10-5aba-4c4d-b0b3-3a38f6038536" path="/var/lib/kubelet/pods/389c1f10-5aba-4c4d-b0b3-3a38f6038536/volumes" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.268195 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aac28d5-6b58-424e-83f8-ec71c53e41ce" path="/var/lib/kubelet/pods/6aac28d5-6b58-424e-83f8-ec71c53e41ce/volumes" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.269057 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acfb1ea8-a8d2-4152-ad18-54d380b289c4" path="/var/lib/kubelet/pods/acfb1ea8-a8d2-4152-ad18-54d380b289c4/volumes" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.271058 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9265b8c-0b80-47d9-8f4b-3d996233341e" path="/var/lib/kubelet/pods/c9265b8c-0b80-47d9-8f4b-3d996233341e/volumes" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.272499 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" 
path="/var/lib/kubelet/pods/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc/volumes" Mar 20 17:46:59 crc kubenswrapper[4795]: I0320 17:46:59.253121 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:46:59 crc kubenswrapper[4795]: E0320 17:46:59.253622 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:47:12 crc kubenswrapper[4795]: I0320 17:47:12.252755 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:47:12 crc kubenswrapper[4795]: E0320 17:47:12.253876 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:47:18 crc kubenswrapper[4795]: I0320 17:47:18.079922 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d7ffs"] Mar 20 17:47:18 crc kubenswrapper[4795]: I0320 17:47:18.101170 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-d7ffs"] Mar 20 17:47:19 crc kubenswrapper[4795]: I0320 17:47:19.275080 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc63f125-2d90-43df-a863-b85fb2eb690e" 
path="/var/lib/kubelet/pods/fc63f125-2d90-43df-a863-b85fb2eb690e/volumes" Mar 20 17:47:25 crc kubenswrapper[4795]: I0320 17:47:25.042838 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-sv5fz"] Mar 20 17:47:25 crc kubenswrapper[4795]: I0320 17:47:25.057263 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-sv5fz"] Mar 20 17:47:25 crc kubenswrapper[4795]: I0320 17:47:25.266031 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e951c331-872c-41b6-b747-d5129b8c0a1b" path="/var/lib/kubelet/pods/e951c331-872c-41b6-b747-d5129b8c0a1b/volumes" Mar 20 17:47:27 crc kubenswrapper[4795]: I0320 17:47:27.286925 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:47:27 crc kubenswrapper[4795]: E0320 17:47:27.288115 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:47:38 crc kubenswrapper[4795]: I0320 17:47:38.252240 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:47:38 crc kubenswrapper[4795]: E0320 17:47:38.255958 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:47:41 crc 
kubenswrapper[4795]: I0320 17:47:41.069891 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2xqcf"] Mar 20 17:47:41 crc kubenswrapper[4795]: I0320 17:47:41.086051 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2xqcf"] Mar 20 17:47:41 crc kubenswrapper[4795]: I0320 17:47:41.097873 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-fe40-account-create-update-jh9t8"] Mar 20 17:47:41 crc kubenswrapper[4795]: I0320 17:47:41.107894 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-fe40-account-create-update-jh9t8"] Mar 20 17:47:41 crc kubenswrapper[4795]: I0320 17:47:41.271785 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ff9ec79-6bd9-470e-8a75-8df1f3c52851" path="/var/lib/kubelet/pods/9ff9ec79-6bd9-470e-8a75-8df1f3c52851/volumes" Mar 20 17:47:41 crc kubenswrapper[4795]: I0320 17:47:41.273089 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e065e2d4-096b-426b-a1f8-14311adb7cbc" path="/var/lib/kubelet/pods/e065e2d4-096b-426b-a1f8-14311adb7cbc/volumes" Mar 20 17:47:42 crc kubenswrapper[4795]: I0320 17:47:42.025492 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-b5nks"] Mar 20 17:47:42 crc kubenswrapper[4795]: I0320 17:47:42.031840 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-b5nks"] Mar 20 17:47:43 crc kubenswrapper[4795]: I0320 17:47:43.274765 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="373ddf98-d9da-4f1f-a6be-3d16e3cbad57" path="/var/lib/kubelet/pods/373ddf98-d9da-4f1f-a6be-3d16e3cbad57/volumes" Mar 20 17:47:44 crc kubenswrapper[4795]: I0320 17:47:44.050953 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0ac3-account-create-update-km4zq"] Mar 20 17:47:44 crc kubenswrapper[4795]: I0320 17:47:44.066925 4795 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/neutron-db-create-d5tx6"] Mar 20 17:47:44 crc kubenswrapper[4795]: I0320 17:47:44.076855 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0ac3-account-create-update-km4zq"] Mar 20 17:47:44 crc kubenswrapper[4795]: I0320 17:47:44.086524 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-d5tx6"] Mar 20 17:47:45 crc kubenswrapper[4795]: I0320 17:47:45.042773 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c60c-account-create-update-hhzrt"] Mar 20 17:47:45 crc kubenswrapper[4795]: I0320 17:47:45.057512 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c60c-account-create-update-hhzrt"] Mar 20 17:47:45 crc kubenswrapper[4795]: I0320 17:47:45.270208 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b1c5f0-e7fb-44b7-8c75-c8036f371c56" path="/var/lib/kubelet/pods/18b1c5f0-e7fb-44b7-8c75-c8036f371c56/volumes" Mar 20 17:47:45 crc kubenswrapper[4795]: I0320 17:47:45.271615 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0847a4-54b5-4068-bfa8-730a19e96d9c" path="/var/lib/kubelet/pods/1c0847a4-54b5-4068-bfa8-730a19e96d9c/volumes" Mar 20 17:47:45 crc kubenswrapper[4795]: I0320 17:47:45.272919 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d71698-4dc2-448a-9330-23372e2d508b" path="/var/lib/kubelet/pods/36d71698-4dc2-448a-9330-23372e2d508b/volumes" Mar 20 17:47:49 crc kubenswrapper[4795]: I0320 17:47:49.034862 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dwhh5"] Mar 20 17:47:49 crc kubenswrapper[4795]: I0320 17:47:49.041599 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dwhh5"] Mar 20 17:47:49 crc kubenswrapper[4795]: I0320 17:47:49.270956 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b15c724-622b-4da7-96a3-01949d04ecac" 
path="/var/lib/kubelet/pods/7b15c724-622b-4da7-96a3-01949d04ecac/volumes" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.110965 4795 scope.go:117] "RemoveContainer" containerID="3e3072d7a6a60ff440da8ec24082885e62958e6ce5ded9fd9910a3d0c2817a07" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.168748 4795 scope.go:117] "RemoveContainer" containerID="1fe0f6a7ba267ec0588c7d4179b78569ace69ae42af4f4ce02a9e28bfc87aa93" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.235398 4795 scope.go:117] "RemoveContainer" containerID="0bd7daab0116804ff4450e365b81c7b208a20da2cd4b665ce83729724da32638" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.298079 4795 scope.go:117] "RemoveContainer" containerID="497d569160d86f7ef365c3f9c537432bd00933f71438ea39707377d46eebd046" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.324754 4795 scope.go:117] "RemoveContainer" containerID="fa7f816c765d44ed743198c38348dd663b04f7cfc3b7f6aac5dffa2623d4db45" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.365291 4795 scope.go:117] "RemoveContainer" containerID="470c232d3b8dc5c0134ac3e2610bcf258029ab0b51ac60e2a7728f94a3beb865" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.418631 4795 scope.go:117] "RemoveContainer" containerID="115e8aa5e635a588311da0792150e7730feaab865eb0acb01117eb70b42bfde3" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.437660 4795 scope.go:117] "RemoveContainer" containerID="d85d02558764daff4d2300daa1f7a51dd79d0b89452cbb1821643bd1f3d0ff3c" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.458549 4795 scope.go:117] "RemoveContainer" containerID="0f5ef18005b655abcc8e4883b9bee8538648f3cf86fe68a6e17cb1ecb194c52e" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.491574 4795 scope.go:117] "RemoveContainer" containerID="2d63356a6d0232331bb76203b1359e46e0f2a21a5ebc5f3160865388f8cf9a1c" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.520210 4795 scope.go:117] "RemoveContainer" 
containerID="e0dfbaccbeeb5b8f99fb5498e364810e7c89661123cf8b487af69c7d6020134e" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.552276 4795 scope.go:117] "RemoveContainer" containerID="c288b4ff895d130555ade7ce513d591493310d7fb3678ee47968d204fe11297a" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.579860 4795 scope.go:117] "RemoveContainer" containerID="5b477a108858cabbe8510a1a17d7f7ac3c69ce053e8fe87204336bad4594bfc0" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.633369 4795 scope.go:117] "RemoveContainer" containerID="5d693b98a616da996bc733e3508b576b31f68d8eb1c9fc7b9800283fac04b343" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.666763 4795 scope.go:117] "RemoveContainer" containerID="61c0a7747547de21c917366527d52306c78b302a64918960d0e832416be0ca0f" Mar 20 17:47:53 crc kubenswrapper[4795]: I0320 17:47:53.253404 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:47:53 crc kubenswrapper[4795]: E0320 17:47:53.254326 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.175409 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567148-nqw6d"] Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.178776 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.184940 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.189433 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.190334 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.213659 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfp2\" (UniqueName: \"kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2\") pod \"auto-csr-approver-29567148-nqw6d\" (UID: \"5541d8b2-57fb-4162-8ee0-ac6630a5d91c\") " pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.242105 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-nqw6d"] Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.315756 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfp2\" (UniqueName: \"kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2\") pod \"auto-csr-approver-29567148-nqw6d\" (UID: \"5541d8b2-57fb-4162-8ee0-ac6630a5d91c\") " pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.344587 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfp2\" (UniqueName: \"kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2\") pod \"auto-csr-approver-29567148-nqw6d\" (UID: \"5541d8b2-57fb-4162-8ee0-ac6630a5d91c\") " 
pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.524646 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:01 crc kubenswrapper[4795]: I0320 17:48:01.023594 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-nqw6d"] Mar 20 17:48:01 crc kubenswrapper[4795]: I0320 17:48:01.724588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" event={"ID":"5541d8b2-57fb-4162-8ee0-ac6630a5d91c","Type":"ContainerStarted","Data":"4c481ef9a098bd5699116bb215caa2af24c09ad9a40e254d4b9b5b9ecdd1f8d5"} Mar 20 17:48:03 crc kubenswrapper[4795]: I0320 17:48:03.750247 4795 generic.go:334] "Generic (PLEG): container finished" podID="5541d8b2-57fb-4162-8ee0-ac6630a5d91c" containerID="29ff925c9ace295b0e664d8ded17085588e84c04311df585ee77cb9a00150d0d" exitCode=0 Mar 20 17:48:03 crc kubenswrapper[4795]: I0320 17:48:03.750346 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" event={"ID":"5541d8b2-57fb-4162-8ee0-ac6630a5d91c","Type":"ContainerDied","Data":"29ff925c9ace295b0e664d8ded17085588e84c04311df585ee77cb9a00150d0d"} Mar 20 17:48:04 crc kubenswrapper[4795]: I0320 17:48:04.252806 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:48:04 crc kubenswrapper[4795]: E0320 17:48:04.253283 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" 
Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.079064 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.146484 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shfp2\" (UniqueName: \"kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2\") pod \"5541d8b2-57fb-4162-8ee0-ac6630a5d91c\" (UID: \"5541d8b2-57fb-4162-8ee0-ac6630a5d91c\") " Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.153529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2" (OuterVolumeSpecName: "kube-api-access-shfp2") pod "5541d8b2-57fb-4162-8ee0-ac6630a5d91c" (UID: "5541d8b2-57fb-4162-8ee0-ac6630a5d91c"). InnerVolumeSpecName "kube-api-access-shfp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.248231 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shfp2\" (UniqueName: \"kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.781477 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" event={"ID":"5541d8b2-57fb-4162-8ee0-ac6630a5d91c","Type":"ContainerDied","Data":"4c481ef9a098bd5699116bb215caa2af24c09ad9a40e254d4b9b5b9ecdd1f8d5"} Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.781537 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c481ef9a098bd5699116bb215caa2af24c09ad9a40e254d4b9b5b9ecdd1f8d5" Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.781536 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:06 crc kubenswrapper[4795]: I0320 17:48:06.154785 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-zrn58"] Mar 20 17:48:06 crc kubenswrapper[4795]: I0320 17:48:06.165367 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-zrn58"] Mar 20 17:48:07 crc kubenswrapper[4795]: I0320 17:48:07.280670 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df931d18-2dae-408e-823d-45c28b0a31c2" path="/var/lib/kubelet/pods/df931d18-2dae-408e-823d-45c28b0a31c2/volumes" Mar 20 17:48:12 crc kubenswrapper[4795]: I0320 17:48:12.863833 4795 generic.go:334] "Generic (PLEG): container finished" podID="b0af5324-4ba3-4a12-9fdb-b467918ba19d" containerID="332d037bd56e3c7a734160d494102af123e05d9ad0b58769a38e86095019dcc7" exitCode=0 Mar 20 17:48:12 crc kubenswrapper[4795]: I0320 17:48:12.864532 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" event={"ID":"b0af5324-4ba3-4a12-9fdb-b467918ba19d","Type":"ContainerDied","Data":"332d037bd56e3c7a734160d494102af123e05d9ad0b58769a38e86095019dcc7"} Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.459006 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.550818 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory\") pod \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.550976 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw6mp\" (UniqueName: \"kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp\") pod \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.551155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam\") pod \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.557852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp" (OuterVolumeSpecName: "kube-api-access-pw6mp") pod "b0af5324-4ba3-4a12-9fdb-b467918ba19d" (UID: "b0af5324-4ba3-4a12-9fdb-b467918ba19d"). InnerVolumeSpecName "kube-api-access-pw6mp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.577757 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b0af5324-4ba3-4a12-9fdb-b467918ba19d" (UID: "b0af5324-4ba3-4a12-9fdb-b467918ba19d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.595389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory" (OuterVolumeSpecName: "inventory") pod "b0af5324-4ba3-4a12-9fdb-b467918ba19d" (UID: "b0af5324-4ba3-4a12-9fdb-b467918ba19d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.654610 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.654647 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw6mp\" (UniqueName: \"kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.654664 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.895109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" 
event={"ID":"b0af5324-4ba3-4a12-9fdb-b467918ba19d","Type":"ContainerDied","Data":"6c07a2c08823474a88320c9dd611604352c7622a34255df23ca241659b07db6b"} Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.895168 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c07a2c08823474a88320c9dd611604352c7622a34255df23ca241659b07db6b" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.895270 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.005115 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9"] Mar 20 17:48:15 crc kubenswrapper[4795]: E0320 17:48:15.005678 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5541d8b2-57fb-4162-8ee0-ac6630a5d91c" containerName="oc" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.005798 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5541d8b2-57fb-4162-8ee0-ac6630a5d91c" containerName="oc" Mar 20 17:48:15 crc kubenswrapper[4795]: E0320 17:48:15.005830 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0af5324-4ba3-4a12-9fdb-b467918ba19d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.005845 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0af5324-4ba3-4a12-9fdb-b467918ba19d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.006121 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5541d8b2-57fb-4162-8ee0-ac6630a5d91c" containerName="oc" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.006176 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0af5324-4ba3-4a12-9fdb-b467918ba19d" 
containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.007103 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.009413 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.011574 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.011678 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.012794 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.017592 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9"] Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.062936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.062994 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-545pr\" (UniqueName: \"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.063027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.165189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.165262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-545pr\" (UniqueName: \"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.165301 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.170185 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.177195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.194239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-545pr\" (UniqueName: \"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.334600 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.917069 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9"] Mar 20 17:48:16 crc kubenswrapper[4795]: I0320 17:48:16.917338 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" event={"ID":"2bad20c9-d77a-4c30-8fa2-979c05697cf4","Type":"ContainerStarted","Data":"3ffae966dde36b17c8819bbe3306ed7854ac13b87b1dac795be893a2b620fea2"} Mar 20 17:48:16 crc kubenswrapper[4795]: I0320 17:48:16.917707 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" event={"ID":"2bad20c9-d77a-4c30-8fa2-979c05697cf4","Type":"ContainerStarted","Data":"4221084fbf63010ad9d50198b40243bcd607b968b80e4bc2aa8911a43dc05dbe"} Mar 20 17:48:16 crc kubenswrapper[4795]: I0320 17:48:16.943400 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" podStartSLOduration=2.488752527 podStartE2EDuration="2.943369474s" podCreationTimestamp="2026-03-20 17:48:14 +0000 UTC" firstStartedPulling="2026-03-20 17:48:15.918451829 +0000 UTC m=+1839.376483370" lastFinishedPulling="2026-03-20 17:48:16.373068736 +0000 UTC m=+1839.831100317" observedRunningTime="2026-03-20 17:48:16.934402255 +0000 UTC m=+1840.392433826" watchObservedRunningTime="2026-03-20 17:48:16.943369474 +0000 UTC m=+1840.401401055" Mar 20 17:48:17 crc kubenswrapper[4795]: I0320 17:48:17.266174 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:48:17 crc kubenswrapper[4795]: E0320 17:48:17.266804 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:48:27 crc kubenswrapper[4795]: I0320 17:48:27.050539 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qfpzw"] Mar 20 17:48:27 crc kubenswrapper[4795]: I0320 17:48:27.060374 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qfpzw"] Mar 20 17:48:27 crc kubenswrapper[4795]: I0320 17:48:27.271948 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4244f6d6-536a-4555-a05b-176d696d427d" path="/var/lib/kubelet/pods/4244f6d6-536a-4555-a05b-176d696d427d/volumes" Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.049472 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-nfr5n"] Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.066200 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7mx5b"] Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.077808 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7mx5b"] Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.087000 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-nfr5n"] Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.266570 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37537245-d57e-4087-ade6-6c028eb4d137" path="/var/lib/kubelet/pods/37537245-d57e-4087-ade6-6c028eb4d137/volumes" Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.267193 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78238b29-6bdd-4f77-847e-731c6c785ed9" 
path="/var/lib/kubelet/pods/78238b29-6bdd-4f77-847e-731c6c785ed9/volumes" Mar 20 17:48:31 crc kubenswrapper[4795]: I0320 17:48:31.252858 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:48:31 crc kubenswrapper[4795]: E0320 17:48:31.254000 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:48:42 crc kubenswrapper[4795]: I0320 17:48:42.035890 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rdxps"] Mar 20 17:48:42 crc kubenswrapper[4795]: I0320 17:48:42.043297 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-4t68k"] Mar 20 17:48:42 crc kubenswrapper[4795]: I0320 17:48:42.052160 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rdxps"] Mar 20 17:48:42 crc kubenswrapper[4795]: I0320 17:48:42.059622 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-4t68k"] Mar 20 17:48:42 crc kubenswrapper[4795]: I0320 17:48:42.253151 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:48:42 crc kubenswrapper[4795]: E0320 17:48:42.253681 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:48:43 crc kubenswrapper[4795]: I0320 17:48:43.271875 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1" path="/var/lib/kubelet/pods/706c47a0-7763-44af-9b14-0e5322a8f2f1/volumes" Mar 20 17:48:43 crc kubenswrapper[4795]: I0320 17:48:43.273177 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d254abd5-b344-416a-b99d-96737388795e" path="/var/lib/kubelet/pods/d254abd5-b344-416a-b99d-96737388795e/volumes" Mar 20 17:48:52 crc kubenswrapper[4795]: I0320 17:48:52.976420 4795 scope.go:117] "RemoveContainer" containerID="43011a486c98482642b4a5dbe9079dc55e5de2d50808977b7d9c6649a885404a" Mar 20 17:48:53 crc kubenswrapper[4795]: I0320 17:48:53.040311 4795 scope.go:117] "RemoveContainer" containerID="8e4952423fe886bac972193165a7d0b5d846db9f137b7cbf7c828182ef389d13" Mar 20 17:48:53 crc kubenswrapper[4795]: I0320 17:48:53.118201 4795 scope.go:117] "RemoveContainer" containerID="ca6e296f2643b1e5c67cd7c021c2bf95d4bbdd0b4c6082814566acaf425b562b" Mar 20 17:48:53 crc kubenswrapper[4795]: I0320 17:48:53.169207 4795 scope.go:117] "RemoveContainer" containerID="8f1173f5ebccc23013501b8ad9c477f608df64a752414481775b8bf5160525e1" Mar 20 17:48:53 crc kubenswrapper[4795]: I0320 17:48:53.216671 4795 scope.go:117] "RemoveContainer" containerID="20458e6912d6d217fb3ae1b5fc987499c631ca087920807a0e981310469342cf" Mar 20 17:48:53 crc kubenswrapper[4795]: I0320 17:48:53.271092 4795 scope.go:117] "RemoveContainer" containerID="b272744c883e204ac3c7a8e8c3e62d9d484f7c330b2609300488f12a64494d78" Mar 20 17:48:54 crc kubenswrapper[4795]: I0320 17:48:54.252524 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:48:54 crc kubenswrapper[4795]: E0320 17:48:54.253052 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:49:07 crc kubenswrapper[4795]: I0320 17:49:07.263457 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:49:07 crc kubenswrapper[4795]: E0320 17:49:07.264556 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:49:20 crc kubenswrapper[4795]: I0320 17:49:20.252391 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:49:20 crc kubenswrapper[4795]: E0320 17:49:20.253112 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.080312 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-fqwkd"] Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.094837 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7051-account-create-update-d2d7p"] Mar 20 
17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.104026 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4l89c"] Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.113838 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-fqwkd"] Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.124994 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7051-account-create-update-d2d7p"] Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.131756 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4l89c"] Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.289948 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1dfe60-98b0-4644-b063-831293f9bd5c" path="/var/lib/kubelet/pods/7d1dfe60-98b0-4644-b063-831293f9bd5c/volumes" Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.291293 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42b654e-f003-45dd-a7c4-07655514643e" path="/var/lib/kubelet/pods/e42b654e-f003-45dd-a7c4-07655514643e/volumes" Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.292128 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc90399-0b15-4fc6-b441-d7df6925c8aa" path="/var/lib/kubelet/pods/efc90399-0b15-4fc6-b441-d7df6925c8aa/volumes" Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.045978 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3748-account-create-update-j2khv"] Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.065642 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wd5n7"] Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.076835 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-100d-account-create-update-7l925"] Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.086263 4795 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-100d-account-create-update-7l925"] Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.094293 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3748-account-create-update-j2khv"] Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.105321 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wd5n7"] Mar 20 17:49:23 crc kubenswrapper[4795]: I0320 17:49:23.271389 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a51797-b6d0-4b5b-9927-54d4b965469e" path="/var/lib/kubelet/pods/65a51797-b6d0-4b5b-9927-54d4b965469e/volumes" Mar 20 17:49:23 crc kubenswrapper[4795]: I0320 17:49:23.272881 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bd6f80f-7908-42b5-b32a-63d585bd9194" path="/var/lib/kubelet/pods/7bd6f80f-7908-42b5-b32a-63d585bd9194/volumes" Mar 20 17:49:23 crc kubenswrapper[4795]: I0320 17:49:23.274507 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f5daae9-920d-496a-ad6a-c016cfb82250" path="/var/lib/kubelet/pods/9f5daae9-920d-496a-ad6a-c016cfb82250/volumes" Mar 20 17:49:34 crc kubenswrapper[4795]: I0320 17:49:34.746990 4795 generic.go:334] "Generic (PLEG): container finished" podID="2bad20c9-d77a-4c30-8fa2-979c05697cf4" containerID="3ffae966dde36b17c8819bbe3306ed7854ac13b87b1dac795be893a2b620fea2" exitCode=0 Mar 20 17:49:34 crc kubenswrapper[4795]: I0320 17:49:34.747053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" event={"ID":"2bad20c9-d77a-4c30-8fa2-979c05697cf4","Type":"ContainerDied","Data":"3ffae966dde36b17c8819bbe3306ed7854ac13b87b1dac795be893a2b620fea2"} Mar 20 17:49:35 crc kubenswrapper[4795]: I0320 17:49:35.253312 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:49:35 crc 
kubenswrapper[4795]: E0320 17:49:35.253793 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.291787 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.456581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory\") pod \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.456853 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-545pr\" (UniqueName: \"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr\") pod \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.456997 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam\") pod \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.470875 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr" (OuterVolumeSpecName: "kube-api-access-545pr") pod "2bad20c9-d77a-4c30-8fa2-979c05697cf4" (UID: "2bad20c9-d77a-4c30-8fa2-979c05697cf4"). InnerVolumeSpecName "kube-api-access-545pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.487714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory" (OuterVolumeSpecName: "inventory") pod "2bad20c9-d77a-4c30-8fa2-979c05697cf4" (UID: "2bad20c9-d77a-4c30-8fa2-979c05697cf4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.494462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2bad20c9-d77a-4c30-8fa2-979c05697cf4" (UID: "2bad20c9-d77a-4c30-8fa2-979c05697cf4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.560881 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.560945 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-545pr\" (UniqueName: \"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.560965 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.775633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" event={"ID":"2bad20c9-d77a-4c30-8fa2-979c05697cf4","Type":"ContainerDied","Data":"4221084fbf63010ad9d50198b40243bcd607b968b80e4bc2aa8911a43dc05dbe"} Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.775725 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4221084fbf63010ad9d50198b40243bcd607b968b80e4bc2aa8911a43dc05dbe" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.775756 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.955968 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5"] Mar 20 17:49:36 crc kubenswrapper[4795]: E0320 17:49:36.956567 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bad20c9-d77a-4c30-8fa2-979c05697cf4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.956599 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bad20c9-d77a-4c30-8fa2-979c05697cf4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.957073 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bad20c9-d77a-4c30-8fa2-979c05697cf4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.959050 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.964241 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.964843 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.964851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.965053 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.978534 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5"] Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.984900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.984981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:36 crc kubenswrapper[4795]: 
I0320 17:49:36.985130 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzgcb\" (UniqueName: \"kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.087757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzgcb\" (UniqueName: \"kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.088096 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.088172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.093571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.094362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.115867 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzgcb\" (UniqueName: \"kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.307971 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.316574 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.698715 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.712884 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5"] Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.788413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" event={"ID":"35b4aa82-d668-474b-b54d-b540190f5a6c","Type":"ContainerStarted","Data":"514ef5f301898c3756907819ae9e98ff22f8aada603fcffcf7334f351de655ef"} Mar 20 17:49:38 crc kubenswrapper[4795]: I0320 17:49:38.125879 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:49:38 crc kubenswrapper[4795]: I0320 17:49:38.805617 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" event={"ID":"35b4aa82-d668-474b-b54d-b540190f5a6c","Type":"ContainerStarted","Data":"8e9a482ad1e6bf23da2ae93a35021c118d8ae4b279a7e4f08b403502cf6ed589"} Mar 20 17:49:38 crc kubenswrapper[4795]: I0320 17:49:38.833481 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" podStartSLOduration=2.411258512 podStartE2EDuration="2.833461452s" podCreationTimestamp="2026-03-20 17:49:36 +0000 UTC" firstStartedPulling="2026-03-20 17:49:37.698401847 +0000 UTC m=+1921.156433378" lastFinishedPulling="2026-03-20 17:49:38.120604747 +0000 UTC m=+1921.578636318" observedRunningTime="2026-03-20 17:49:38.827230228 +0000 UTC m=+1922.285261769" watchObservedRunningTime="2026-03-20 17:49:38.833461452 +0000 UTC m=+1922.291493003" Mar 20 17:49:43 crc 
kubenswrapper[4795]: I0320 17:49:43.859538 4795 generic.go:334] "Generic (PLEG): container finished" podID="35b4aa82-d668-474b-b54d-b540190f5a6c" containerID="8e9a482ad1e6bf23da2ae93a35021c118d8ae4b279a7e4f08b403502cf6ed589" exitCode=0 Mar 20 17:49:43 crc kubenswrapper[4795]: I0320 17:49:43.859592 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" event={"ID":"35b4aa82-d668-474b-b54d-b540190f5a6c","Type":"ContainerDied","Data":"8e9a482ad1e6bf23da2ae93a35021c118d8ae4b279a7e4f08b403502cf6ed589"} Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.358427 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.556796 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam\") pod \"35b4aa82-d668-474b-b54d-b540190f5a6c\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.557029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory\") pod \"35b4aa82-d668-474b-b54d-b540190f5a6c\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.557070 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzgcb\" (UniqueName: \"kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb\") pod \"35b4aa82-d668-474b-b54d-b540190f5a6c\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.563137 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb" (OuterVolumeSpecName: "kube-api-access-xzgcb") pod "35b4aa82-d668-474b-b54d-b540190f5a6c" (UID: "35b4aa82-d668-474b-b54d-b540190f5a6c"). InnerVolumeSpecName "kube-api-access-xzgcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.587664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory" (OuterVolumeSpecName: "inventory") pod "35b4aa82-d668-474b-b54d-b540190f5a6c" (UID: "35b4aa82-d668-474b-b54d-b540190f5a6c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.606607 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "35b4aa82-d668-474b-b54d-b540190f5a6c" (UID: "35b4aa82-d668-474b-b54d-b540190f5a6c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.660205 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.660258 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzgcb\" (UniqueName: \"kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.660283 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.884091 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" event={"ID":"35b4aa82-d668-474b-b54d-b540190f5a6c","Type":"ContainerDied","Data":"514ef5f301898c3756907819ae9e98ff22f8aada603fcffcf7334f351de655ef"} Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.884125 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="514ef5f301898c3756907819ae9e98ff22f8aada603fcffcf7334f351de655ef" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.884188 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.988697 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"] Mar 20 17:49:45 crc kubenswrapper[4795]: E0320 17:49:45.989245 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b4aa82-d668-474b-b54d-b540190f5a6c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.989260 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b4aa82-d668-474b-b54d-b540190f5a6c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.989457 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b4aa82-d668-474b-b54d-b540190f5a6c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.990018 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.993842 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.995585 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.996032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.997543 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.009488 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"] Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.168416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.168626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gt76\" (UniqueName: \"kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 
17:49:46.168914 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.253004 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:49:46 crc kubenswrapper[4795]: E0320 17:49:46.253425 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.271514 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.271914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 
17:49:46.272722 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gt76\" (UniqueName: \"kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.276316 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.276991 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.293041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gt76\" (UniqueName: \"kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.327483 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.918550 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"] Mar 20 17:49:46 crc kubenswrapper[4795]: W0320 17:49:46.925359 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20b330a0_830c_419e_81fe_a36dd1a32cc2.slice/crio-ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110 WatchSource:0}: Error finding container ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110: Status 404 returned error can't find the container with id ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110 Mar 20 17:49:47 crc kubenswrapper[4795]: I0320 17:49:47.906271 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" event={"ID":"20b330a0-830c-419e-81fe-a36dd1a32cc2","Type":"ContainerStarted","Data":"902ad09d05a718a4dc819827f582612e00fd8e9ea310fb5d70c495ae4bd899ea"} Mar 20 17:49:47 crc kubenswrapper[4795]: I0320 17:49:47.906756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" event={"ID":"20b330a0-830c-419e-81fe-a36dd1a32cc2","Type":"ContainerStarted","Data":"ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110"} Mar 20 17:49:47 crc kubenswrapper[4795]: I0320 17:49:47.939020 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" podStartSLOduration=2.555337413 podStartE2EDuration="2.938995435s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:46.928642372 +0000 UTC m=+1930.386673923" lastFinishedPulling="2026-03-20 17:49:47.312300414 +0000 UTC m=+1930.770331945" 
observedRunningTime="2026-03-20 17:49:47.931063438 +0000 UTC m=+1931.389095019" watchObservedRunningTime="2026-03-20 17:49:47.938995435 +0000 UTC m=+1931.397027016" Mar 20 17:49:51 crc kubenswrapper[4795]: I0320 17:49:51.070257 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jfhz8"] Mar 20 17:49:51 crc kubenswrapper[4795]: I0320 17:49:51.076718 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jfhz8"] Mar 20 17:49:51 crc kubenswrapper[4795]: I0320 17:49:51.268875 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a8b32b-fab3-401f-b667-592c8840bd97" path="/var/lib/kubelet/pods/02a8b32b-fab3-401f-b667-592c8840bd97/volumes" Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.437648 4795 scope.go:117] "RemoveContainer" containerID="181a6c95c401e76e8326dba0d2e07f193da50d6dbe0b9151509c36ea5ad10c3e" Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.500586 4795 scope.go:117] "RemoveContainer" containerID="1a5ed57a211fe9b0c1882f91516bfc8da29711316e391c1c87ed18df2cb6cc36" Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.542832 4795 scope.go:117] "RemoveContainer" containerID="f4b5007e4a1309d08572b5c31e2719d5a9d1e8abc1f29797304920c21729de14" Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.595282 4795 scope.go:117] "RemoveContainer" containerID="21aac3ceb6dfd938908085675b810b1f95e9aaa0d7afc715430454788951ca0a" Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.640026 4795 scope.go:117] "RemoveContainer" containerID="d5e2a993de0ac2a73d513cdc5305eaa4c6be7243356c29c7534462e994b17675" Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.670086 4795 scope.go:117] "RemoveContainer" containerID="b63d10a82b890eac4b6bd4726e08b48ae844a0390be4307858d97c75a41d914f" Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.709645 4795 scope.go:117] "RemoveContainer" 
containerID="43233e70951461edfae55a0a1e96e29418696077a56a7e8b60e307ca9af5a951" Mar 20 17:49:59 crc kubenswrapper[4795]: I0320 17:49:59.255487 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:49:59 crc kubenswrapper[4795]: E0320 17:49:59.256835 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.171833 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567150-glzlx"] Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.188480 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-glzlx"] Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.188609 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-glzlx" Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.192655 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.192929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.192665 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.351453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4bzc\" (UniqueName: \"kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc\") pod \"auto-csr-approver-29567150-glzlx\" (UID: \"5e628e06-ee48-4969-824e-fba400b67d3a\") " pod="openshift-infra/auto-csr-approver-29567150-glzlx" Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.454179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4bzc\" (UniqueName: \"kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc\") pod \"auto-csr-approver-29567150-glzlx\" (UID: \"5e628e06-ee48-4969-824e-fba400b67d3a\") " pod="openshift-infra/auto-csr-approver-29567150-glzlx" Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.478659 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4bzc\" (UniqueName: \"kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc\") pod \"auto-csr-approver-29567150-glzlx\" (UID: \"5e628e06-ee48-4969-824e-fba400b67d3a\") " pod="openshift-infra/auto-csr-approver-29567150-glzlx" Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.533405 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-glzlx" Mar 20 17:50:01 crc kubenswrapper[4795]: W0320 17:50:01.048162 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e628e06_ee48_4969_824e_fba400b67d3a.slice/crio-8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2 WatchSource:0}: Error finding container 8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2: Status 404 returned error can't find the container with id 8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2 Mar 20 17:50:01 crc kubenswrapper[4795]: I0320 17:50:01.063024 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-glzlx"] Mar 20 17:50:02 crc kubenswrapper[4795]: I0320 17:50:02.062940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-glzlx" event={"ID":"5e628e06-ee48-4969-824e-fba400b67d3a","Type":"ContainerStarted","Data":"8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2"} Mar 20 17:50:03 crc kubenswrapper[4795]: I0320 17:50:03.071573 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e628e06-ee48-4969-824e-fba400b67d3a" containerID="b0e8f1ce702c9e1cfb11740285e904a1e8d1f711ef3e97850efbb6236da59523" exitCode=0 Mar 20 17:50:03 crc kubenswrapper[4795]: I0320 17:50:03.073464 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-glzlx" event={"ID":"5e628e06-ee48-4969-824e-fba400b67d3a","Type":"ContainerDied","Data":"b0e8f1ce702c9e1cfb11740285e904a1e8d1f711ef3e97850efbb6236da59523"} Mar 20 17:50:04 crc kubenswrapper[4795]: I0320 17:50:04.492905 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-glzlx" Mar 20 17:50:04 crc kubenswrapper[4795]: I0320 17:50:04.668758 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4bzc\" (UniqueName: \"kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc\") pod \"5e628e06-ee48-4969-824e-fba400b67d3a\" (UID: \"5e628e06-ee48-4969-824e-fba400b67d3a\") " Mar 20 17:50:04 crc kubenswrapper[4795]: I0320 17:50:04.676962 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc" (OuterVolumeSpecName: "kube-api-access-t4bzc") pod "5e628e06-ee48-4969-824e-fba400b67d3a" (UID: "5e628e06-ee48-4969-824e-fba400b67d3a"). InnerVolumeSpecName "kube-api-access-t4bzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:50:04 crc kubenswrapper[4795]: I0320 17:50:04.771325 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4bzc\" (UniqueName: \"kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:05 crc kubenswrapper[4795]: I0320 17:50:05.102037 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-glzlx" event={"ID":"5e628e06-ee48-4969-824e-fba400b67d3a","Type":"ContainerDied","Data":"8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2"} Mar 20 17:50:05 crc kubenswrapper[4795]: I0320 17:50:05.102109 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2" Mar 20 17:50:05 crc kubenswrapper[4795]: I0320 17:50:05.102785 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-glzlx" Mar 20 17:50:05 crc kubenswrapper[4795]: I0320 17:50:05.563983 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-khl6t"] Mar 20 17:50:05 crc kubenswrapper[4795]: I0320 17:50:05.570495 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-khl6t"] Mar 20 17:50:07 crc kubenswrapper[4795]: I0320 17:50:07.281543 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6396cd8-bc19-4f24-ae36-12356bfa8133" path="/var/lib/kubelet/pods/a6396cd8-bc19-4f24-ae36-12356bfa8133/volumes" Mar 20 17:50:12 crc kubenswrapper[4795]: I0320 17:50:12.048990 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-kmgk9"] Mar 20 17:50:12 crc kubenswrapper[4795]: I0320 17:50:12.062764 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-kmgk9"] Mar 20 17:50:12 crc kubenswrapper[4795]: I0320 17:50:12.252512 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:50:12 crc kubenswrapper[4795]: E0320 17:50:12.252805 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:50:13 crc kubenswrapper[4795]: I0320 17:50:13.292195 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18794d5c-e43a-44dc-9510-763a31275104" path="/var/lib/kubelet/pods/18794d5c-e43a-44dc-9510-763a31275104/volumes" Mar 20 17:50:14 crc kubenswrapper[4795]: I0320 17:50:14.045378 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcxtg"] Mar 20 17:50:14 crc kubenswrapper[4795]: I0320 17:50:14.057392 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcxtg"] Mar 20 17:50:15 crc kubenswrapper[4795]: I0320 17:50:15.273466 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f61db3a-a7de-495d-8305-b9e2910415e2" path="/var/lib/kubelet/pods/4f61db3a-a7de-495d-8305-b9e2910415e2/volumes" Mar 20 17:50:24 crc kubenswrapper[4795]: I0320 17:50:24.252847 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:50:24 crc kubenswrapper[4795]: E0320 17:50:24.253723 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:50:36 crc kubenswrapper[4795]: I0320 17:50:36.478675 4795 generic.go:334] "Generic (PLEG): container finished" podID="20b330a0-830c-419e-81fe-a36dd1a32cc2" containerID="902ad09d05a718a4dc819827f582612e00fd8e9ea310fb5d70c495ae4bd899ea" exitCode=0 Mar 20 17:50:36 crc kubenswrapper[4795]: I0320 17:50:36.478720 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" event={"ID":"20b330a0-830c-419e-81fe-a36dd1a32cc2","Type":"ContainerDied","Data":"902ad09d05a718a4dc819827f582612e00fd8e9ea310fb5d70c495ae4bd899ea"} Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.032983 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.182461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam\") pod \"20b330a0-830c-419e-81fe-a36dd1a32cc2\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.182616 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory\") pod \"20b330a0-830c-419e-81fe-a36dd1a32cc2\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.182793 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gt76\" (UniqueName: \"kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76\") pod \"20b330a0-830c-419e-81fe-a36dd1a32cc2\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.191791 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76" (OuterVolumeSpecName: "kube-api-access-8gt76") pod "20b330a0-830c-419e-81fe-a36dd1a32cc2" (UID: "20b330a0-830c-419e-81fe-a36dd1a32cc2"). InnerVolumeSpecName "kube-api-access-8gt76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.248145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory" (OuterVolumeSpecName: "inventory") pod "20b330a0-830c-419e-81fe-a36dd1a32cc2" (UID: "20b330a0-830c-419e-81fe-a36dd1a32cc2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.248843 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "20b330a0-830c-419e-81fe-a36dd1a32cc2" (UID: "20b330a0-830c-419e-81fe-a36dd1a32cc2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.285456 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.285494 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gt76\" (UniqueName: \"kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.285508 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.506236 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" event={"ID":"20b330a0-830c-419e-81fe-a36dd1a32cc2","Type":"ContainerDied","Data":"ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110"} Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.506281 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 
17:50:38.506298 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.708092 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"] Mar 20 17:50:38 crc kubenswrapper[4795]: E0320 17:50:38.708565 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e628e06-ee48-4969-824e-fba400b67d3a" containerName="oc" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.708582 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e628e06-ee48-4969-824e-fba400b67d3a" containerName="oc" Mar 20 17:50:38 crc kubenswrapper[4795]: E0320 17:50:38.708594 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b330a0-830c-419e-81fe-a36dd1a32cc2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.708601 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b330a0-830c-419e-81fe-a36dd1a32cc2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.708880 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b330a0-830c-419e-81fe-a36dd1a32cc2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.708896 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e628e06-ee48-4969-824e-fba400b67d3a" containerName="oc" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.709447 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.711213 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.711425 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.711737 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.713550 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.719893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"] Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.897763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.897832 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.897997 
4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwftz\" (UniqueName: \"kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.000416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.000488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.000586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwftz\" (UniqueName: \"kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.004167 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.011480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.021802 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwftz\" (UniqueName: \"kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.025812 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.254464 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:50:39 crc kubenswrapper[4795]: E0320 17:50:39.255574 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.647009 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"] Mar 20 17:50:39 crc kubenswrapper[4795]: W0320 17:50:39.654134 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d666090_1065_4b2d_9ac6_b84776b53d0a.slice/crio-87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39 WatchSource:0}: Error finding container 87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39: Status 404 returned error can't find the container with id 87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39 Mar 20 17:50:40 crc kubenswrapper[4795]: I0320 17:50:40.530324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" event={"ID":"3d666090-1065-4b2d-9ac6-b84776b53d0a","Type":"ContainerStarted","Data":"894408f7a6f3e593d4f36352df00c09efefbfa26446049a6cd6ff9f1cbfa90a4"} Mar 20 17:50:40 crc kubenswrapper[4795]: I0320 17:50:40.530648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" event={"ID":"3d666090-1065-4b2d-9ac6-b84776b53d0a","Type":"ContainerStarted","Data":"87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39"} Mar 20 17:50:40 crc kubenswrapper[4795]: I0320 17:50:40.557076 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" podStartSLOduration=2.034467904 podStartE2EDuration="2.55704474s" podCreationTimestamp="2026-03-20 17:50:38 +0000 UTC" firstStartedPulling="2026-03-20 17:50:39.656740474 +0000 UTC m=+1983.114772025" lastFinishedPulling="2026-03-20 17:50:40.17931728 +0000 UTC m=+1983.637348861" observedRunningTime="2026-03-20 17:50:40.552637433 +0000 UTC m=+1984.010669004" watchObservedRunningTime="2026-03-20 17:50:40.55704474 +0000 UTC m=+1984.015076321" Mar 20 17:50:53 crc kubenswrapper[4795]: I0320 17:50:53.252816 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:50:53 crc kubenswrapper[4795]: I0320 17:50:53.673369 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5"} Mar 20 17:50:53 crc kubenswrapper[4795]: I0320 17:50:53.884309 4795 scope.go:117] "RemoveContainer" containerID="2bb981684096c6d7989fb2cb73e5f71d3f241740ae3f49c070189810ce7e7bb1" Mar 20 17:50:53 crc kubenswrapper[4795]: I0320 17:50:53.950909 4795 scope.go:117] "RemoveContainer" containerID="de6cb1e775438df313c59586663c35fb681b66b389159fc9df68bc69d850ac1c" Mar 20 17:50:53 crc kubenswrapper[4795]: I0320 17:50:53.985492 4795 scope.go:117] "RemoveContainer" containerID="29087c37b0e22594df358a498bb26205f2050bb1e4a607372b3a2ba3b4df8dd7" Mar 20 17:50:59 crc kubenswrapper[4795]: I0320 17:50:59.067113 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-67csj"] Mar 20 17:50:59 crc kubenswrapper[4795]: I0320 17:50:59.082342 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-67csj"] Mar 20 17:50:59 crc kubenswrapper[4795]: I0320 17:50:59.271781 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc8602c-1f19-4825-b3e5-32d643f12430" path="/var/lib/kubelet/pods/fbc8602c-1f19-4825-b3e5-32d643f12430/volumes" Mar 20 17:51:33 crc kubenswrapper[4795]: I0320 17:51:33.117490 4795 generic.go:334] "Generic (PLEG): container finished" podID="3d666090-1065-4b2d-9ac6-b84776b53d0a" containerID="894408f7a6f3e593d4f36352df00c09efefbfa26446049a6cd6ff9f1cbfa90a4" exitCode=0 Mar 20 17:51:33 crc kubenswrapper[4795]: I0320 17:51:33.117594 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" event={"ID":"3d666090-1065-4b2d-9ac6-b84776b53d0a","Type":"ContainerDied","Data":"894408f7a6f3e593d4f36352df00c09efefbfa26446049a6cd6ff9f1cbfa90a4"} Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.597789 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.760245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory\") pod \"3d666090-1065-4b2d-9ac6-b84776b53d0a\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.760639 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwftz\" (UniqueName: \"kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz\") pod \"3d666090-1065-4b2d-9ac6-b84776b53d0a\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.760777 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam\") pod \"3d666090-1065-4b2d-9ac6-b84776b53d0a\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.768038 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz" (OuterVolumeSpecName: "kube-api-access-mwftz") pod "3d666090-1065-4b2d-9ac6-b84776b53d0a" (UID: "3d666090-1065-4b2d-9ac6-b84776b53d0a"). InnerVolumeSpecName "kube-api-access-mwftz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.813487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3d666090-1065-4b2d-9ac6-b84776b53d0a" (UID: "3d666090-1065-4b2d-9ac6-b84776b53d0a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.816095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory" (OuterVolumeSpecName: "inventory") pod "3d666090-1065-4b2d-9ac6-b84776b53d0a" (UID: "3d666090-1065-4b2d-9ac6-b84776b53d0a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.864521 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwftz\" (UniqueName: \"kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.864598 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.864627 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.148642 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" 
event={"ID":"3d666090-1065-4b2d-9ac6-b84776b53d0a","Type":"ContainerDied","Data":"87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39"} Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.148747 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.148842 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.434369 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j6rls"] Mar 20 17:51:35 crc kubenswrapper[4795]: E0320 17:51:35.435215 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d666090-1065-4b2d-9ac6-b84776b53d0a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.435258 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d666090-1065-4b2d-9ac6-b84776b53d0a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.435631 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d666090-1065-4b2d-9ac6-b84776b53d0a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.436792 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.440670 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.441784 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.443871 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.444102 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.453659 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j6rls"] Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.582734 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjzxb\" (UniqueName: \"kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.582849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.583016 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.684870 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.685134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjzxb\" (UniqueName: \"kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.685232 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.693606 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: 
I0320 17:51:35.699452 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.715371 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjzxb\" (UniqueName: \"kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.766233 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:36 crc kubenswrapper[4795]: I0320 17:51:36.354432 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j6rls"] Mar 20 17:51:37 crc kubenswrapper[4795]: I0320 17:51:37.172530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" event={"ID":"80cf5a83-936d-4789-a7bc-b91cdb0e564d","Type":"ContainerStarted","Data":"a299166917e0a539fa06dd999e0055403dbb7b59d130347d0ae95dd5287f90cf"} Mar 20 17:51:37 crc kubenswrapper[4795]: I0320 17:51:37.172876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" event={"ID":"80cf5a83-936d-4789-a7bc-b91cdb0e564d","Type":"ContainerStarted","Data":"02964fff73e3665f58b601e9cc6da3da341c2ab0c31c859cfed6fac2de7b5310"} Mar 20 17:51:37 crc kubenswrapper[4795]: I0320 17:51:37.212930 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" podStartSLOduration=1.75360969 
podStartE2EDuration="2.212903578s" podCreationTimestamp="2026-03-20 17:51:35 +0000 UTC" firstStartedPulling="2026-03-20 17:51:36.365134188 +0000 UTC m=+2039.823165729" lastFinishedPulling="2026-03-20 17:51:36.824428036 +0000 UTC m=+2040.282459617" observedRunningTime="2026-03-20 17:51:37.197423027 +0000 UTC m=+2040.655454578" watchObservedRunningTime="2026-03-20 17:51:37.212903578 +0000 UTC m=+2040.670935149" Mar 20 17:51:44 crc kubenswrapper[4795]: I0320 17:51:44.249316 4795 generic.go:334] "Generic (PLEG): container finished" podID="80cf5a83-936d-4789-a7bc-b91cdb0e564d" containerID="a299166917e0a539fa06dd999e0055403dbb7b59d130347d0ae95dd5287f90cf" exitCode=0 Mar 20 17:51:44 crc kubenswrapper[4795]: I0320 17:51:44.249486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" event={"ID":"80cf5a83-936d-4789-a7bc-b91cdb0e564d","Type":"ContainerDied","Data":"a299166917e0a539fa06dd999e0055403dbb7b59d130347d0ae95dd5287f90cf"} Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.698524 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.745526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0\") pod \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.745576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam\") pod \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.745753 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjzxb\" (UniqueName: \"kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb\") pod \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.754971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb" (OuterVolumeSpecName: "kube-api-access-sjzxb") pod "80cf5a83-936d-4789-a7bc-b91cdb0e564d" (UID: "80cf5a83-936d-4789-a7bc-b91cdb0e564d"). InnerVolumeSpecName "kube-api-access-sjzxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.783132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "80cf5a83-936d-4789-a7bc-b91cdb0e564d" (UID: "80cf5a83-936d-4789-a7bc-b91cdb0e564d"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.799365 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "80cf5a83-936d-4789-a7bc-b91cdb0e564d" (UID: "80cf5a83-936d-4789-a7bc-b91cdb0e564d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.846767 4795 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.847009 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.847100 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjzxb\" (UniqueName: \"kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.277339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" event={"ID":"80cf5a83-936d-4789-a7bc-b91cdb0e564d","Type":"ContainerDied","Data":"02964fff73e3665f58b601e9cc6da3da341c2ab0c31c859cfed6fac2de7b5310"} Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.277579 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02964fff73e3665f58b601e9cc6da3da341c2ab0c31c859cfed6fac2de7b5310" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.277483 
4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.430104 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6"] Mar 20 17:51:46 crc kubenswrapper[4795]: E0320 17:51:46.430515 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cf5a83-936d-4789-a7bc-b91cdb0e564d" containerName="ssh-known-hosts-edpm-deployment" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.430533 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cf5a83-936d-4789-a7bc-b91cdb0e564d" containerName="ssh-known-hosts-edpm-deployment" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.430798 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cf5a83-936d-4789-a7bc-b91cdb0e564d" containerName="ssh-known-hosts-edpm-deployment" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.431507 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.446854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6"] Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.469067 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.469140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.469081 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.469451 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.563397 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.564230 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvsgk\" (UniqueName: \"kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.564559 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.667271 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvsgk\" (UniqueName: \"kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.667614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.668494 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.673040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: 
\"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.673424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.688342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvsgk\" (UniqueName: \"kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.783495 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:47 crc kubenswrapper[4795]: I0320 17:51:47.354505 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6"] Mar 20 17:51:47 crc kubenswrapper[4795]: W0320 17:51:47.358195 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cdb4943_60a1_41cc_aead_1702a4c1f68a.slice/crio-1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0 WatchSource:0}: Error finding container 1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0: Status 404 returned error can't find the container with id 1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0 Mar 20 17:51:48 crc kubenswrapper[4795]: I0320 17:51:48.309569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" event={"ID":"9cdb4943-60a1-41cc-aead-1702a4c1f68a","Type":"ContainerStarted","Data":"6ddabdaea64d113fb6013d8724623c1c666cb6ccb8ea39aaa1327bb1e88278fe"} Mar 20 17:51:48 crc kubenswrapper[4795]: I0320 17:51:48.311219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" event={"ID":"9cdb4943-60a1-41cc-aead-1702a4c1f68a","Type":"ContainerStarted","Data":"1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0"} Mar 20 17:51:48 crc kubenswrapper[4795]: I0320 17:51:48.341326 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" podStartSLOduration=1.8062972259999999 podStartE2EDuration="2.34130195s" podCreationTimestamp="2026-03-20 17:51:46 +0000 UTC" firstStartedPulling="2026-03-20 17:51:47.362266895 +0000 UTC m=+2050.820298466" lastFinishedPulling="2026-03-20 17:51:47.897271609 +0000 UTC m=+2051.355303190" 
observedRunningTime="2026-03-20 17:51:48.337983547 +0000 UTC m=+2051.796015158" watchObservedRunningTime="2026-03-20 17:51:48.34130195 +0000 UTC m=+2051.799333531" Mar 20 17:51:54 crc kubenswrapper[4795]: I0320 17:51:54.126237 4795 scope.go:117] "RemoveContainer" containerID="c3dbd02db17863581582e40f291eb346e5dca8aa3c7d277d71e53142232286eb" Mar 20 17:51:56 crc kubenswrapper[4795]: I0320 17:51:56.398745 4795 generic.go:334] "Generic (PLEG): container finished" podID="9cdb4943-60a1-41cc-aead-1702a4c1f68a" containerID="6ddabdaea64d113fb6013d8724623c1c666cb6ccb8ea39aaa1327bb1e88278fe" exitCode=0 Mar 20 17:51:56 crc kubenswrapper[4795]: I0320 17:51:56.398830 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" event={"ID":"9cdb4943-60a1-41cc-aead-1702a4c1f68a","Type":"ContainerDied","Data":"6ddabdaea64d113fb6013d8724623c1c666cb6ccb8ea39aaa1327bb1e88278fe"} Mar 20 17:51:57 crc kubenswrapper[4795]: I0320 17:51:57.879335 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.024754 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvsgk\" (UniqueName: \"kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk\") pod \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.025113 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory\") pod \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.025152 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam\") pod \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.033914 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk" (OuterVolumeSpecName: "kube-api-access-fvsgk") pod "9cdb4943-60a1-41cc-aead-1702a4c1f68a" (UID: "9cdb4943-60a1-41cc-aead-1702a4c1f68a"). InnerVolumeSpecName "kube-api-access-fvsgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.074161 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory" (OuterVolumeSpecName: "inventory") pod "9cdb4943-60a1-41cc-aead-1702a4c1f68a" (UID: "9cdb4943-60a1-41cc-aead-1702a4c1f68a"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.077570 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9cdb4943-60a1-41cc-aead-1702a4c1f68a" (UID: "9cdb4943-60a1-41cc-aead-1702a4c1f68a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.128945 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.129021 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.129041 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvsgk\" (UniqueName: \"kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.430780 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" event={"ID":"9cdb4943-60a1-41cc-aead-1702a4c1f68a","Type":"ContainerDied","Data":"1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0"} Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.430819 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 
17:51:58.430876 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.528851 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88"] Mar 20 17:51:58 crc kubenswrapper[4795]: E0320 17:51:58.529436 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdb4943-60a1-41cc-aead-1702a4c1f68a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.529463 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdb4943-60a1-41cc-aead-1702a4c1f68a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.529759 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cdb4943-60a1-41cc-aead-1702a4c1f68a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.530618 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.533372 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.533449 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.533572 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.534338 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.542102 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bdvp\" (UniqueName: \"kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.542290 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.542497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.552386 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88"] Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.643180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bdvp\" (UniqueName: \"kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.643258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.643334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.647823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.653514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.666196 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bdvp\" (UniqueName: \"kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.849420 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:59 crc kubenswrapper[4795]: I0320 17:51:59.392349 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88"] Mar 20 17:51:59 crc kubenswrapper[4795]: I0320 17:51:59.440003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" event={"ID":"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4","Type":"ContainerStarted","Data":"5364cbab99f63192734fe793bfb234d47c253a170235e5b57235660ea2366376"} Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.142656 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567152-ksnjt"] Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.144432 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.171853 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.173724 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.172209 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.176760 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xwmd\" (UniqueName: \"kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd\") pod \"auto-csr-approver-29567152-ksnjt\" (UID: \"d8abf4de-a372-47df-b14c-490f1e084a56\") " pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 
17:52:00.186453 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-ksnjt"] Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.279473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwmd\" (UniqueName: \"kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd\") pod \"auto-csr-approver-29567152-ksnjt\" (UID: \"d8abf4de-a372-47df-b14c-490f1e084a56\") " pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.301102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwmd\" (UniqueName: \"kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd\") pod \"auto-csr-approver-29567152-ksnjt\" (UID: \"d8abf4de-a372-47df-b14c-490f1e084a56\") " pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.451855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" event={"ID":"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4","Type":"ContainerStarted","Data":"01d757bb54c99abdbeddf07efb6aaf2657cdd1489743a59d6dc864872b8e779b"} Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.483280 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" podStartSLOduration=2.05557857 podStartE2EDuration="2.483243723s" podCreationTimestamp="2026-03-20 17:51:58 +0000 UTC" firstStartedPulling="2026-03-20 17:51:59.398901069 +0000 UTC m=+2062.856932620" lastFinishedPulling="2026-03-20 17:51:59.826566182 +0000 UTC m=+2063.284597773" observedRunningTime="2026-03-20 17:52:00.479758645 +0000 UTC m=+2063.937790206" watchObservedRunningTime="2026-03-20 17:52:00.483243723 +0000 UTC m=+2063.941275334" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 
17:52:00.497782 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:01 crc kubenswrapper[4795]: W0320 17:52:01.013857 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8abf4de_a372_47df_b14c_490f1e084a56.slice/crio-5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5 WatchSource:0}: Error finding container 5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5: Status 404 returned error can't find the container with id 5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5 Mar 20 17:52:01 crc kubenswrapper[4795]: I0320 17:52:01.017616 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-ksnjt"] Mar 20 17:52:01 crc kubenswrapper[4795]: I0320 17:52:01.461979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" event={"ID":"d8abf4de-a372-47df-b14c-490f1e084a56","Type":"ContainerStarted","Data":"5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5"} Mar 20 17:52:02 crc kubenswrapper[4795]: I0320 17:52:02.472960 4795 generic.go:334] "Generic (PLEG): container finished" podID="d8abf4de-a372-47df-b14c-490f1e084a56" containerID="fa7c9e74af14d50a1c364d101636ba64da237edd40eebd00160c638dba974672" exitCode=0 Mar 20 17:52:02 crc kubenswrapper[4795]: I0320 17:52:02.473036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" event={"ID":"d8abf4de-a372-47df-b14c-490f1e084a56","Type":"ContainerDied","Data":"fa7c9e74af14d50a1c364d101636ba64da237edd40eebd00160c638dba974672"} Mar 20 17:52:03 crc kubenswrapper[4795]: I0320 17:52:03.857099 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:03 crc kubenswrapper[4795]: I0320 17:52:03.960053 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xwmd\" (UniqueName: \"kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd\") pod \"d8abf4de-a372-47df-b14c-490f1e084a56\" (UID: \"d8abf4de-a372-47df-b14c-490f1e084a56\") " Mar 20 17:52:03 crc kubenswrapper[4795]: I0320 17:52:03.966266 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd" (OuterVolumeSpecName: "kube-api-access-5xwmd") pod "d8abf4de-a372-47df-b14c-490f1e084a56" (UID: "d8abf4de-a372-47df-b14c-490f1e084a56"). InnerVolumeSpecName "kube-api-access-5xwmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.062363 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xwmd\" (UniqueName: \"kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.499995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" event={"ID":"d8abf4de-a372-47df-b14c-490f1e084a56","Type":"ContainerDied","Data":"5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5"} Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.500045 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5" Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.500110 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.956453 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-xzfkq"] Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.970048 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-xzfkq"] Mar 20 17:52:05 crc kubenswrapper[4795]: I0320 17:52:05.268493 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" path="/var/lib/kubelet/pods/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c/volumes" Mar 20 17:52:10 crc kubenswrapper[4795]: I0320 17:52:10.564076 4795 generic.go:334] "Generic (PLEG): container finished" podID="1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" containerID="01d757bb54c99abdbeddf07efb6aaf2657cdd1489743a59d6dc864872b8e779b" exitCode=0 Mar 20 17:52:10 crc kubenswrapper[4795]: I0320 17:52:10.564186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" event={"ID":"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4","Type":"ContainerDied","Data":"01d757bb54c99abdbeddf07efb6aaf2657cdd1489743a59d6dc864872b8e779b"} Mar 20 17:52:11 crc kubenswrapper[4795]: I0320 17:52:11.935195 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.058654 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory\") pod \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.058751 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bdvp\" (UniqueName: \"kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp\") pod \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.058972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam\") pod \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.064371 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp" (OuterVolumeSpecName: "kube-api-access-4bdvp") pod "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" (UID: "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4"). InnerVolumeSpecName "kube-api-access-4bdvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.091052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" (UID: "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.107531 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory" (OuterVolumeSpecName: "inventory") pod "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" (UID: "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.161110 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.161144 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.161158 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bdvp\" (UniqueName: \"kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.587435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" 
event={"ID":"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4","Type":"ContainerDied","Data":"5364cbab99f63192734fe793bfb234d47c253a170235e5b57235660ea2366376"} Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.587495 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5364cbab99f63192734fe793bfb234d47c253a170235e5b57235660ea2366376" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.587529 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.707004 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5"] Mar 20 17:52:12 crc kubenswrapper[4795]: E0320 17:52:12.707389 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8abf4de-a372-47df-b14c-490f1e084a56" containerName="oc" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.707412 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8abf4de-a372-47df-b14c-490f1e084a56" containerName="oc" Mar 20 17:52:12 crc kubenswrapper[4795]: E0320 17:52:12.707461 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.707471 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.707668 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8abf4de-a372-47df-b14c-490f1e084a56" containerName="oc" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.707730 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" 
containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.708427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.720484 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.721033 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.722872 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.723548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.723664 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.723673 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.723916 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.726890 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.753992 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5"] Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 
17:52:12.776483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776556 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776592 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776675 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776779 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776809 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brpkg\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776886 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: 
I0320 17:52:12.776936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.777075 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.777131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.777203 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.777238 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.777272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877869 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877925 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878016 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brpkg\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878161 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.887484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.887475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.887675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.888075 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.888388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.888950 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.889029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.889325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.890053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.891331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.892477 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.893304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.895582 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.898007 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brpkg\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:13 crc kubenswrapper[4795]: I0320 17:52:13.039636 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:13 crc kubenswrapper[4795]: I0320 17:52:13.610188 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5"] Mar 20 17:52:13 crc kubenswrapper[4795]: W0320 17:52:13.612624 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ab0ae7d_87ee_4e3f_a963_d126c5ddab8c.slice/crio-e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6 WatchSource:0}: Error finding container e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6: Status 404 returned error can't find the container with id e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6 Mar 20 17:52:14 crc kubenswrapper[4795]: I0320 17:52:14.619440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" event={"ID":"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c","Type":"ContainerStarted","Data":"fbddf6801bd4755280989348b7233b7faafa5ea394bfbc59e7cb23626aa16c9a"} Mar 20 17:52:14 crc kubenswrapper[4795]: I0320 17:52:14.620118 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" event={"ID":"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c","Type":"ContainerStarted","Data":"e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6"} Mar 20 17:52:14 crc kubenswrapper[4795]: I0320 17:52:14.666727 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" podStartSLOduration=2.264384852 podStartE2EDuration="2.666708826s" podCreationTimestamp="2026-03-20 17:52:12 +0000 UTC" firstStartedPulling="2026-03-20 17:52:13.615265576 +0000 UTC m=+2077.073297127" lastFinishedPulling="2026-03-20 17:52:14.01758952 +0000 UTC m=+2077.475621101" observedRunningTime="2026-03-20 17:52:14.650622266 +0000 UTC m=+2078.108653827" watchObservedRunningTime="2026-03-20 17:52:14.666708826 +0000 UTC m=+2078.124740387" Mar 20 17:52:52 crc kubenswrapper[4795]: I0320 17:52:52.028544 4795 generic.go:334] "Generic (PLEG): container finished" podID="0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" containerID="fbddf6801bd4755280989348b7233b7faafa5ea394bfbc59e7cb23626aa16c9a" exitCode=0 Mar 20 17:52:52 crc kubenswrapper[4795]: I0320 17:52:52.028640 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" event={"ID":"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c","Type":"ContainerDied","Data":"fbddf6801bd4755280989348b7233b7faafa5ea394bfbc59e7cb23626aa16c9a"} Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.538299 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724649 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724841 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724901 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725050 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brpkg\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc 
kubenswrapper[4795]: I0320 17:52:53.725216 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725247 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725292 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725327 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.731982 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.733122 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.733181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg" (OuterVolumeSpecName: "kube-api-access-brpkg") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "kube-api-access-brpkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.735615 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.735624 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.736459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.737004 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.737646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.737882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.740242 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.742961 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.747880 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.786063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.786547 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory" (OuterVolumeSpecName: "inventory") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828297 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828340 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828357 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828371 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828384 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828396 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828414 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828426 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828437 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brpkg\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828449 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828460 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828471 4795 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828483 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828495 4795 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.051340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" event={"ID":"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c","Type":"ContainerDied","Data":"e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6"} Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.051399 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.051503 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.223792 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"] Mar 20 17:52:54 crc kubenswrapper[4795]: E0320 17:52:54.224480 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.224570 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.224821 4795 scope.go:117] "RemoveContainer" containerID="19365e3a16d7780a49439b00d5a850dc06a2d65e28a412bb9b05e779d9d4ec51" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.225049 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.226147 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.231780 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.232009 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.232284 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.232493 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.232637 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.238690 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"] Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.347681 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.348439 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: 
\"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.348512 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.348543 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.348653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqp2f\" (UniqueName: \"kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.450089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.450147 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.450196 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.450228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.450882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqp2f\" (UniqueName: \"kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.451005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.454254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.455004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.455207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.466159 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqp2f\" (UniqueName: \"kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.592828 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:52:55 crc kubenswrapper[4795]: I0320 17:52:55.624661 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"] Mar 20 17:52:56 crc kubenswrapper[4795]: I0320 17:52:56.170649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" event={"ID":"6c737290-0616-475b-a839-cca387d8d90d","Type":"ContainerStarted","Data":"885fd0d01e15e8e73814661bd573c22dd2c0eba50edad204e7a8f73bdb2ca832"} Mar 20 17:52:57 crc kubenswrapper[4795]: I0320 17:52:57.181909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" event={"ID":"6c737290-0616-475b-a839-cca387d8d90d","Type":"ContainerStarted","Data":"ce3704459df378ee719ee4944060e7d6335ef9231a60476865a774b316275a14"} Mar 20 17:52:57 crc kubenswrapper[4795]: I0320 17:52:57.225009 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" podStartSLOduration=2.621942797 podStartE2EDuration="3.2249921s" podCreationTimestamp="2026-03-20 17:52:54 +0000 UTC" firstStartedPulling="2026-03-20 17:52:55.631551018 +0000 UTC m=+2119.089582569" lastFinishedPulling="2026-03-20 17:52:56.234600291 +0000 UTC m=+2119.692631872" observedRunningTime="2026-03-20 17:52:57.21728055 +0000 UTC m=+2120.675312091" watchObservedRunningTime="2026-03-20 17:52:57.2249921 +0000 UTC m=+2120.683023641" Mar 20 17:53:11 crc kubenswrapper[4795]: I0320 17:53:11.299789 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:53:11 crc kubenswrapper[4795]: I0320 17:53:11.300478 4795 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:53:21 crc kubenswrapper[4795]: I0320 17:53:21.949344 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"] Mar 20 17:53:21 crc kubenswrapper[4795]: I0320 17:53:21.954801 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:21 crc kubenswrapper[4795]: I0320 17:53:21.966634 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"] Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.109643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.109830 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.109913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w8p9\" (UniqueName: \"kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9\") pod \"redhat-operators-5z2tx\" (UID: 
\"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.211560 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.211645 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w8p9\" (UniqueName: \"kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.211824 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.212355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.212954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " 
pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.237754 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w8p9\" (UniqueName: \"kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.287391 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.764754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"] Mar 20 17:53:23 crc kubenswrapper[4795]: I0320 17:53:23.497039 4795 generic.go:334] "Generic (PLEG): container finished" podID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerID="f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872" exitCode=0 Mar 20 17:53:23 crc kubenswrapper[4795]: I0320 17:53:23.497083 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerDied","Data":"f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872"} Mar 20 17:53:23 crc kubenswrapper[4795]: I0320 17:53:23.497301 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerStarted","Data":"98600df691122c949eb7ec07e25241678890358ec78c766404af0bcce9f83085"} Mar 20 17:53:24 crc kubenswrapper[4795]: I0320 17:53:24.512334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" 
event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerStarted","Data":"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd"} Mar 20 17:53:27 crc kubenswrapper[4795]: I0320 17:53:27.542879 4795 generic.go:334] "Generic (PLEG): container finished" podID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerID="4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd" exitCode=0 Mar 20 17:53:27 crc kubenswrapper[4795]: I0320 17:53:27.542950 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerDied","Data":"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd"} Mar 20 17:53:29 crc kubenswrapper[4795]: I0320 17:53:29.559970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerStarted","Data":"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f"} Mar 20 17:53:29 crc kubenswrapper[4795]: I0320 17:53:29.580662 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5z2tx" podStartSLOduration=3.6511068399999997 podStartE2EDuration="8.58064466s" podCreationTimestamp="2026-03-20 17:53:21 +0000 UTC" firstStartedPulling="2026-03-20 17:53:23.499112019 +0000 UTC m=+2146.957143560" lastFinishedPulling="2026-03-20 17:53:28.428649839 +0000 UTC m=+2151.886681380" observedRunningTime="2026-03-20 17:53:29.57875121 +0000 UTC m=+2153.036782751" watchObservedRunningTime="2026-03-20 17:53:29.58064466 +0000 UTC m=+2153.038676201" Mar 20 17:53:32 crc kubenswrapper[4795]: I0320 17:53:32.287719 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:32 crc kubenswrapper[4795]: I0320 17:53:32.287975 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:33 crc kubenswrapper[4795]: I0320 17:53:33.386626 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5z2tx" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="registry-server" probeResult="failure" output=< Mar 20 17:53:33 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:53:33 crc kubenswrapper[4795]: > Mar 20 17:53:41 crc kubenswrapper[4795]: I0320 17:53:41.300546 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:53:41 crc kubenswrapper[4795]: I0320 17:53:41.301181 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:53:42 crc kubenswrapper[4795]: I0320 17:53:42.334152 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:42 crc kubenswrapper[4795]: I0320 17:53:42.393676 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:42 crc kubenswrapper[4795]: I0320 17:53:42.572633 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"] Mar 20 17:53:43 crc kubenswrapper[4795]: I0320 17:53:43.686175 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5z2tx" 
podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="registry-server" containerID="cri-o://0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f" gracePeriod=2 Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.105022 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.245909 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content\") pod \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.246825 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities\") pod \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.247162 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w8p9\" (UniqueName: \"kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9\") pod \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.247843 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities" (OuterVolumeSpecName: "utilities") pod "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" (UID: "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.252020 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.253948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9" (OuterVolumeSpecName: "kube-api-access-4w8p9") pod "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" (UID: "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8"). InnerVolumeSpecName "kube-api-access-4w8p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.357508 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w8p9\" (UniqueName: \"kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.437316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" (UID: "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.458679 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.697062 4795 generic.go:334] "Generic (PLEG): container finished" podID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerID="0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f" exitCode=0 Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.697109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerDied","Data":"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f"} Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.697125 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5z2tx" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.697143 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerDied","Data":"98600df691122c949eb7ec07e25241678890358ec78c766404af0bcce9f83085"} Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.697164 4795 scope.go:117] "RemoveContainer" containerID="0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.715931 4795 scope.go:117] "RemoveContainer" containerID="4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.729759 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"] Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.737154 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"] Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.751361 4795 scope.go:117] "RemoveContainer" containerID="f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.791777 4795 scope.go:117] "RemoveContainer" containerID="0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f" Mar 20 17:53:44 crc kubenswrapper[4795]: E0320 17:53:44.792322 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f\": container with ID starting with 0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f not found: ID does not exist" containerID="0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.792360 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f"} err="failed to get container status \"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f\": rpc error: code = NotFound desc = could not find container \"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f\": container with ID starting with 0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f not found: ID does not exist" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.792390 4795 scope.go:117] "RemoveContainer" containerID="4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd" Mar 20 17:53:44 crc kubenswrapper[4795]: E0320 17:53:44.792846 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd\": container with ID starting with 4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd not found: ID does not exist" containerID="4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.792866 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd"} err="failed to get container status \"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd\": rpc error: code = NotFound desc = could not find container \"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd\": container with ID starting with 4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd not found: ID does not exist" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.792879 4795 scope.go:117] "RemoveContainer" containerID="f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872" Mar 20 17:53:44 crc kubenswrapper[4795]: E0320 
17:53:44.793268 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872\": container with ID starting with f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872 not found: ID does not exist" containerID="f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872" Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.793301 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872"} err="failed to get container status \"f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872\": rpc error: code = NotFound desc = could not find container \"f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872\": container with ID starting with f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872 not found: ID does not exist" Mar 20 17:53:45 crc kubenswrapper[4795]: I0320 17:53:45.267638 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" path="/var/lib/kubelet/pods/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8/volumes" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.443896 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t7p7j"] Mar 20 17:53:52 crc kubenswrapper[4795]: E0320 17:53:52.444793 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="extract-content" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.444805 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="extract-content" Mar 20 17:53:52 crc kubenswrapper[4795]: E0320 17:53:52.444816 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="registry-server" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.444822 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="registry-server" Mar 20 17:53:52 crc kubenswrapper[4795]: E0320 17:53:52.444839 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="extract-utilities" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.444846 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="extract-utilities" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.445028 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="registry-server" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.446347 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.463629 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t7p7j"] Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.557812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjx7l\" (UniqueName: \"kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.557969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities\") pod \"community-operators-t7p7j\" (UID: 
\"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.557992 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.659741 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.659789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.659844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjx7l\" (UniqueName: \"kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.660267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities\") pod \"community-operators-t7p7j\" (UID: 
\"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.660342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.679191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjx7l\" (UniqueName: \"kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.779008 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:53 crc kubenswrapper[4795]: I0320 17:53:53.380343 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t7p7j"] Mar 20 17:53:53 crc kubenswrapper[4795]: I0320 17:53:53.784376 4795 generic.go:334] "Generic (PLEG): container finished" podID="c0450211-15da-4926-9f2c-f1169ac44b02" containerID="f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9" exitCode=0 Mar 20 17:53:53 crc kubenswrapper[4795]: I0320 17:53:53.784475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerDied","Data":"f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9"} Mar 20 17:53:53 crc kubenswrapper[4795]: I0320 17:53:53.784666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerStarted","Data":"bed2375c0e8c4495026b6591fb4eeb0ebb046e32c6a24c250a89855bed8b766d"} Mar 20 17:53:55 crc kubenswrapper[4795]: I0320 17:53:55.959075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerStarted","Data":"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e"} Mar 20 17:53:56 crc kubenswrapper[4795]: I0320 17:53:56.988278 4795 generic.go:334] "Generic (PLEG): container finished" podID="c0450211-15da-4926-9f2c-f1169ac44b02" containerID="c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e" exitCode=0 Mar 20 17:53:56 crc kubenswrapper[4795]: I0320 17:53:56.988330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" 
event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerDied","Data":"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e"} Mar 20 17:53:58 crc kubenswrapper[4795]: I0320 17:53:58.002890 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerStarted","Data":"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377"} Mar 20 17:53:58 crc kubenswrapper[4795]: I0320 17:53:58.036196 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t7p7j" podStartSLOduration=2.354875688 podStartE2EDuration="6.036177902s" podCreationTimestamp="2026-03-20 17:53:52 +0000 UTC" firstStartedPulling="2026-03-20 17:53:53.787098184 +0000 UTC m=+2177.245129735" lastFinishedPulling="2026-03-20 17:53:57.468400408 +0000 UTC m=+2180.926431949" observedRunningTime="2026-03-20 17:53:58.027648257 +0000 UTC m=+2181.485679838" watchObservedRunningTime="2026-03-20 17:53:58.036177902 +0000 UTC m=+2181.494209453" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.151972 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567154-4z2rq"] Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.155667 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.162342 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.162373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.164648 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.200050 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-4z2rq"] Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.258379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g89wj\" (UniqueName: \"kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj\") pod \"auto-csr-approver-29567154-4z2rq\" (UID: \"61821949-5c88-4f4c-adab-b93269540a03\") " pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.360575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g89wj\" (UniqueName: \"kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj\") pod \"auto-csr-approver-29567154-4z2rq\" (UID: \"61821949-5c88-4f4c-adab-b93269540a03\") " pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.382372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g89wj\" (UniqueName: \"kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj\") pod \"auto-csr-approver-29567154-4z2rq\" (UID: \"61821949-5c88-4f4c-adab-b93269540a03\") " 
pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.490563 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.982115 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-4z2rq"] Mar 20 17:54:00 crc kubenswrapper[4795]: W0320 17:54:00.986921 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61821949_5c88_4f4c_adab_b93269540a03.slice/crio-84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb WatchSource:0}: Error finding container 84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb: Status 404 returned error can't find the container with id 84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb Mar 20 17:54:01 crc kubenswrapper[4795]: I0320 17:54:01.029674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" event={"ID":"61821949-5c88-4f4c-adab-b93269540a03","Type":"ContainerStarted","Data":"84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb"} Mar 20 17:54:02 crc kubenswrapper[4795]: I0320 17:54:02.792792 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:54:02 crc kubenswrapper[4795]: I0320 17:54:02.793761 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:54:02 crc kubenswrapper[4795]: I0320 17:54:02.851671 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.051676 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="6c737290-0616-475b-a839-cca387d8d90d" containerID="ce3704459df378ee719ee4944060e7d6335ef9231a60476865a774b316275a14" exitCode=0 Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.051757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" event={"ID":"6c737290-0616-475b-a839-cca387d8d90d","Type":"ContainerDied","Data":"ce3704459df378ee719ee4944060e7d6335ef9231a60476865a774b316275a14"} Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.054467 4795 generic.go:334] "Generic (PLEG): container finished" podID="61821949-5c88-4f4c-adab-b93269540a03" containerID="6f68cab9e191fff6af7e246da5293fa0fd1c14c356566586bb75900bd179fcf6" exitCode=0 Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.054538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" event={"ID":"61821949-5c88-4f4c-adab-b93269540a03","Type":"ContainerDied","Data":"6f68cab9e191fff6af7e246da5293fa0fd1c14c356566586bb75900bd179fcf6"} Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.138576 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.189550 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t7p7j"] Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.440309 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.539904 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g89wj\" (UniqueName: \"kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj\") pod \"61821949-5c88-4f4c-adab-b93269540a03\" (UID: \"61821949-5c88-4f4c-adab-b93269540a03\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.545849 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj" (OuterVolumeSpecName: "kube-api-access-g89wj") pod "61821949-5c88-4f4c-adab-b93269540a03" (UID: "61821949-5c88-4f4c-adab-b93269540a03"). InnerVolumeSpecName "kube-api-access-g89wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.597843 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641005 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0\") pod \"6c737290-0616-475b-a839-cca387d8d90d\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle\") pod \"6c737290-0616-475b-a839-cca387d8d90d\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam\") pod \"6c737290-0616-475b-a839-cca387d8d90d\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641411 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqp2f\" (UniqueName: \"kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f\") pod \"6c737290-0616-475b-a839-cca387d8d90d\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory\") pod \"6c737290-0616-475b-a839-cca387d8d90d\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641920 4795 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g89wj\" (UniqueName: \"kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.648866 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f" (OuterVolumeSpecName: "kube-api-access-qqp2f") pod "6c737290-0616-475b-a839-cca387d8d90d" (UID: "6c737290-0616-475b-a839-cca387d8d90d"). InnerVolumeSpecName "kube-api-access-qqp2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.648886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6c737290-0616-475b-a839-cca387d8d90d" (UID: "6c737290-0616-475b-a839-cca387d8d90d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.667660 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "6c737290-0616-475b-a839-cca387d8d90d" (UID: "6c737290-0616-475b-a839-cca387d8d90d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.668629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory" (OuterVolumeSpecName: "inventory") pod "6c737290-0616-475b-a839-cca387d8d90d" (UID: "6c737290-0616-475b-a839-cca387d8d90d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.689227 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6c737290-0616-475b-a839-cca387d8d90d" (UID: "6c737290-0616-475b-a839-cca387d8d90d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.743221 4795 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.743273 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.743292 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.743312 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqp2f\" (UniqueName: \"kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.743331 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.087370 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.087365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" event={"ID":"6c737290-0616-475b-a839-cca387d8d90d","Type":"ContainerDied","Data":"885fd0d01e15e8e73814661bd573c22dd2c0eba50edad204e7a8f73bdb2ca832"} Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.087588 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="885fd0d01e15e8e73814661bd573c22dd2c0eba50edad204e7a8f73bdb2ca832" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.090757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" event={"ID":"61821949-5c88-4f4c-adab-b93269540a03","Type":"ContainerDied","Data":"84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb"} Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.090838 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.090779 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.090958 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t7p7j" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="registry-server" containerID="cri-o://e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377" gracePeriod=2 Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.207732 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"] Mar 20 17:54:05 crc kubenswrapper[4795]: E0320 17:54:05.208147 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61821949-5c88-4f4c-adab-b93269540a03" containerName="oc" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.208163 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="61821949-5c88-4f4c-adab-b93269540a03" containerName="oc" Mar 20 17:54:05 crc kubenswrapper[4795]: E0320 17:54:05.208202 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c737290-0616-475b-a839-cca387d8d90d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.208211 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c737290-0616-475b-a839-cca387d8d90d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.208428 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="61821949-5c88-4f4c-adab-b93269540a03" containerName="oc" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.208459 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c737290-0616-475b-a839-cca387d8d90d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.209257 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.213798 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.215994 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.216281 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.216600 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.216824 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.217019 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.224754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"] Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255500 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255549 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rrq2\" (UniqueName: \"kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255576 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255680 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255759 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358795 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358867 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rrq2\" (UniqueName: \"kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358885 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.368139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.371857 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.372779 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.377372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.377462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.378531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rrq2\" (UniqueName: \"kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.515797 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-nqw6d"] Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.523787 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-nqw6d"] Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.541252 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.608291 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.670203 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content\") pod \"c0450211-15da-4926-9f2c-f1169ac44b02\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.670248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities\") pod \"c0450211-15da-4926-9f2c-f1169ac44b02\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.670326 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjx7l\" (UniqueName: \"kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l\") pod \"c0450211-15da-4926-9f2c-f1169ac44b02\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.677480 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities" (OuterVolumeSpecName: "utilities") pod "c0450211-15da-4926-9f2c-f1169ac44b02" (UID: "c0450211-15da-4926-9f2c-f1169ac44b02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.698819 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l" (OuterVolumeSpecName: "kube-api-access-mjx7l") pod "c0450211-15da-4926-9f2c-f1169ac44b02" (UID: "c0450211-15da-4926-9f2c-f1169ac44b02"). InnerVolumeSpecName "kube-api-access-mjx7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.773102 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.773451 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjx7l\" (UniqueName: \"kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.114892 4795 generic.go:334] "Generic (PLEG): container finished" podID="c0450211-15da-4926-9f2c-f1169ac44b02" containerID="e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377" exitCode=0 Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.114933 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerDied","Data":"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377"} Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.114960 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerDied","Data":"bed2375c0e8c4495026b6591fb4eeb0ebb046e32c6a24c250a89855bed8b766d"} Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.114977 4795 scope.go:117] "RemoveContainer" containerID="e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.115052 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.141832 4795 scope.go:117] "RemoveContainer" containerID="c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.144741 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0450211-15da-4926-9f2c-f1169ac44b02" (UID: "c0450211-15da-4926-9f2c-f1169ac44b02"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.168894 4795 scope.go:117] "RemoveContainer" containerID="f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.186298 4795 scope.go:117] "RemoveContainer" containerID="e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377" Mar 20 17:54:06 crc kubenswrapper[4795]: E0320 17:54:06.186720 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377\": container with ID starting with e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377 not found: ID does not exist" containerID="e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.186758 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377"} err="failed to get container status \"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377\": rpc error: code = NotFound desc = could not find container \"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377\": container with ID starting with e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377 not found: ID does not exist" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.186787 4795 scope.go:117] "RemoveContainer" containerID="c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e" Mar 20 17:54:06 crc kubenswrapper[4795]: E0320 17:54:06.187257 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e\": container with ID starting with 
c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e not found: ID does not exist" containerID="c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.187282 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e"} err="failed to get container status \"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e\": rpc error: code = NotFound desc = could not find container \"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e\": container with ID starting with c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e not found: ID does not exist" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.187301 4795 scope.go:117] "RemoveContainer" containerID="f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9" Mar 20 17:54:06 crc kubenswrapper[4795]: E0320 17:54:06.187620 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9\": container with ID starting with f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9 not found: ID does not exist" containerID="f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.187643 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9"} err="failed to get container status \"f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9\": rpc error: code = NotFound desc = could not find container \"f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9\": container with ID starting with f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9 not found: ID does not 
exist" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.209002 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.240681 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"] Mar 20 17:54:06 crc kubenswrapper[4795]: W0320 17:54:06.242606 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode29f4857_ff0d_4806_ba09_74448200e8e2.slice/crio-cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d WatchSource:0}: Error finding container cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d: Status 404 returned error can't find the container with id cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.450007 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t7p7j"] Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.458771 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t7p7j"] Mar 20 17:54:07 crc kubenswrapper[4795]: I0320 17:54:07.130264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" event={"ID":"e29f4857-ff0d-4806-ba09-74448200e8e2","Type":"ContainerStarted","Data":"cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d"} Mar 20 17:54:07 crc kubenswrapper[4795]: I0320 17:54:07.270778 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5541d8b2-57fb-4162-8ee0-ac6630a5d91c" path="/var/lib/kubelet/pods/5541d8b2-57fb-4162-8ee0-ac6630a5d91c/volumes" Mar 20 17:54:07 crc kubenswrapper[4795]: I0320 
17:54:07.272010 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" path="/var/lib/kubelet/pods/c0450211-15da-4926-9f2c-f1169ac44b02/volumes" Mar 20 17:54:08 crc kubenswrapper[4795]: I0320 17:54:08.145405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" event={"ID":"e29f4857-ff0d-4806-ba09-74448200e8e2","Type":"ContainerStarted","Data":"577e1b360b8d0a6d1805b049f234defcfede1bfedd6ea85d12e6a79faf221601"} Mar 20 17:54:08 crc kubenswrapper[4795]: I0320 17:54:08.185468 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" podStartSLOduration=2.422390864 podStartE2EDuration="3.185441706s" podCreationTimestamp="2026-03-20 17:54:05 +0000 UTC" firstStartedPulling="2026-03-20 17:54:06.246275453 +0000 UTC m=+2189.704306994" lastFinishedPulling="2026-03-20 17:54:07.009326255 +0000 UTC m=+2190.467357836" observedRunningTime="2026-03-20 17:54:08.17337884 +0000 UTC m=+2191.631410461" watchObservedRunningTime="2026-03-20 17:54:08.185441706 +0000 UTC m=+2191.643473277" Mar 20 17:54:11 crc kubenswrapper[4795]: I0320 17:54:11.300440 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:54:11 crc kubenswrapper[4795]: I0320 17:54:11.300894 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:54:11 crc kubenswrapper[4795]: I0320 
17:54:11.300970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:54:11 crc kubenswrapper[4795]: I0320 17:54:11.302100 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:54:11 crc kubenswrapper[4795]: I0320 17:54:11.302188 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5" gracePeriod=600 Mar 20 17:54:12 crc kubenswrapper[4795]: I0320 17:54:12.189794 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5" exitCode=0 Mar 20 17:54:12 crc kubenswrapper[4795]: I0320 17:54:12.189861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5"} Mar 20 17:54:12 crc kubenswrapper[4795]: I0320 17:54:12.190504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"} Mar 20 17:54:12 crc kubenswrapper[4795]: I0320 17:54:12.190528 4795 scope.go:117] 
"RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.690892 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-965kf"] Mar 20 17:54:37 crc kubenswrapper[4795]: E0320 17:54:37.691813 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="registry-server" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.691827 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="registry-server" Mar 20 17:54:37 crc kubenswrapper[4795]: E0320 17:54:37.691846 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="extract-utilities" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.691855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="extract-utilities" Mar 20 17:54:37 crc kubenswrapper[4795]: E0320 17:54:37.691882 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="extract-content" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.691891 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="extract-content" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.692107 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="registry-server" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.698266 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.713279 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-965kf"] Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.773975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-catalog-content\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.774616 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-utilities\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.774845 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wb6\" (UniqueName: \"kubernetes.io/projected/58169b5e-ad5e-4928-8511-1677518e9c01-kube-api-access-j2wb6\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.876445 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wb6\" (UniqueName: \"kubernetes.io/projected/58169b5e-ad5e-4928-8511-1677518e9c01-kube-api-access-j2wb6\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.876514 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-catalog-content\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.876602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-utilities\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.877238 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-catalog-content\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.877290 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-utilities\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.908962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wb6\" (UniqueName: \"kubernetes.io/projected/58169b5e-ad5e-4928-8511-1677518e9c01-kube-api-access-j2wb6\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:38 crc kubenswrapper[4795]: I0320 17:54:38.060861 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:38 crc kubenswrapper[4795]: I0320 17:54:38.560098 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-965kf"] Mar 20 17:54:39 crc kubenswrapper[4795]: I0320 17:54:39.490496 4795 generic.go:334] "Generic (PLEG): container finished" podID="58169b5e-ad5e-4928-8511-1677518e9c01" containerID="4a2f3baa702fca105563aa68055d3a58583a358513f90dfc80bff4554e4f2096" exitCode=0 Mar 20 17:54:39 crc kubenswrapper[4795]: I0320 17:54:39.490735 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-965kf" event={"ID":"58169b5e-ad5e-4928-8511-1677518e9c01","Type":"ContainerDied","Data":"4a2f3baa702fca105563aa68055d3a58583a358513f90dfc80bff4554e4f2096"} Mar 20 17:54:39 crc kubenswrapper[4795]: I0320 17:54:39.490923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-965kf" event={"ID":"58169b5e-ad5e-4928-8511-1677518e9c01","Type":"ContainerStarted","Data":"7a0b6420471b9f828cd1128c605846105b818dc7f2339e52c0d06e0ac67b44b1"} Mar 20 17:54:39 crc kubenswrapper[4795]: I0320 17:54:39.493096 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:54:45 crc kubenswrapper[4795]: I0320 17:54:45.546277 4795 generic.go:334] "Generic (PLEG): container finished" podID="58169b5e-ad5e-4928-8511-1677518e9c01" containerID="7c278ea905649fc7c9070ed011d683ccaad85e86e73b928da4c656a69cedf2d6" exitCode=0 Mar 20 17:54:45 crc kubenswrapper[4795]: I0320 17:54:45.546472 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-965kf" event={"ID":"58169b5e-ad5e-4928-8511-1677518e9c01","Type":"ContainerDied","Data":"7c278ea905649fc7c9070ed011d683ccaad85e86e73b928da4c656a69cedf2d6"} Mar 20 17:54:46 crc kubenswrapper[4795]: I0320 17:54:46.557007 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-965kf" event={"ID":"58169b5e-ad5e-4928-8511-1677518e9c01","Type":"ContainerStarted","Data":"997556791f530fd72643777860c4e6ca0bf8ee815bffe9d233a1b1f1d9372b6c"} Mar 20 17:54:46 crc kubenswrapper[4795]: I0320 17:54:46.579964 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-965kf" podStartSLOduration=3.009004363 podStartE2EDuration="9.579941447s" podCreationTimestamp="2026-03-20 17:54:37 +0000 UTC" firstStartedPulling="2026-03-20 17:54:39.492815854 +0000 UTC m=+2222.950847405" lastFinishedPulling="2026-03-20 17:54:46.063752948 +0000 UTC m=+2229.521784489" observedRunningTime="2026-03-20 17:54:46.579032848 +0000 UTC m=+2230.037064389" watchObservedRunningTime="2026-03-20 17:54:46.579941447 +0000 UTC m=+2230.037973008" Mar 20 17:54:48 crc kubenswrapper[4795]: I0320 17:54:48.061938 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:48 crc kubenswrapper[4795]: I0320 17:54:48.062340 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:48 crc kubenswrapper[4795]: I0320 17:54:48.118804 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:54 crc kubenswrapper[4795]: I0320 17:54:54.367449 4795 scope.go:117] "RemoveContainer" containerID="29ff925c9ace295b0e664d8ded17085588e84c04311df585ee77cb9a00150d0d" Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.155076 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-965kf" Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.262217 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-965kf"] Mar 20 17:54:58 crc 
kubenswrapper[4795]: I0320 17:54:58.322177 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94mw5"] Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.322470 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-94mw5" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="registry-server" containerID="cri-o://4e8b0e1259002a3662a0cddeb6fd2fae0a2ae00aa800f638b73b2262d55bedd2" gracePeriod=2 Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.732172 4795 generic.go:334] "Generic (PLEG): container finished" podID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerID="4e8b0e1259002a3662a0cddeb6fd2fae0a2ae00aa800f638b73b2262d55bedd2" exitCode=0 Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.732272 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mw5" event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerDied","Data":"4e8b0e1259002a3662a0cddeb6fd2fae0a2ae00aa800f638b73b2262d55bedd2"} Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.732506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mw5" event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerDied","Data":"5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd"} Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.732519 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd" Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.782016 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.836945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th9xz\" (UniqueName: \"kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz\") pod \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.836998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content\") pod \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.842301 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz" (OuterVolumeSpecName: "kube-api-access-th9xz") pod "a79f11dc-5b5e-4929-9a6f-281ade73c24a" (UID: "a79f11dc-5b5e-4929-9a6f-281ade73c24a"). InnerVolumeSpecName "kube-api-access-th9xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.855059 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities\") pod \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.855386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities" (OuterVolumeSpecName: "utilities") pod "a79f11dc-5b5e-4929-9a6f-281ade73c24a" (UID: "a79f11dc-5b5e-4929-9a6f-281ade73c24a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.856011 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th9xz\" (UniqueName: \"kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.856035 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.888473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a79f11dc-5b5e-4929-9a6f-281ade73c24a" (UID: "a79f11dc-5b5e-4929-9a6f-281ade73c24a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.957460 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:59 crc kubenswrapper[4795]: I0320 17:54:59.738921 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:54:59 crc kubenswrapper[4795]: I0320 17:54:59.761722 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94mw5"] Mar 20 17:54:59 crc kubenswrapper[4795]: I0320 17:54:59.772796 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-94mw5"] Mar 20 17:55:00 crc kubenswrapper[4795]: I0320 17:55:00.747804 4795 generic.go:334] "Generic (PLEG): container finished" podID="e29f4857-ff0d-4806-ba09-74448200e8e2" containerID="577e1b360b8d0a6d1805b049f234defcfede1bfedd6ea85d12e6a79faf221601" exitCode=0 Mar 20 17:55:00 crc kubenswrapper[4795]: I0320 17:55:00.747992 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" event={"ID":"e29f4857-ff0d-4806-ba09-74448200e8e2","Type":"ContainerDied","Data":"577e1b360b8d0a6d1805b049f234defcfede1bfedd6ea85d12e6a79faf221601"} Mar 20 17:55:01 crc kubenswrapper[4795]: I0320 17:55:01.269430 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" path="/var/lib/kubelet/pods/a79f11dc-5b5e-4929-9a6f-281ade73c24a/volumes" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.265022 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323088 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rrq2\" (UniqueName: \"kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " Mar 20 17:55:02 crc 
kubenswrapper[4795]: I0320 17:55:02.323335 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.328984 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.329908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2" (OuterVolumeSpecName: "kube-api-access-4rrq2") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "kube-api-access-4rrq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.357017 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.359324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.362862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory" (OuterVolumeSpecName: "inventory") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.374291 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426184 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426230 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426244 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426260 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rrq2\" (UniqueName: \"kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426272 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426285 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.773293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" event={"ID":"e29f4857-ff0d-4806-ba09-74448200e8e2","Type":"ContainerDied","Data":"cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d"} Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.773344 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.773982 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.886560 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q"] Mar 20 17:55:02 crc kubenswrapper[4795]: E0320 17:55:02.887290 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29f4857-ff0d-4806-ba09-74448200e8e2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887322 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29f4857-ff0d-4806-ba09-74448200e8e2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 17:55:02 crc kubenswrapper[4795]: E0320 17:55:02.887352 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="extract-utilities" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887361 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="extract-utilities" Mar 20 17:55:02 crc kubenswrapper[4795]: E0320 17:55:02.887389 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="registry-server" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887397 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="registry-server" Mar 20 17:55:02 crc kubenswrapper[4795]: E0320 17:55:02.887414 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="extract-content" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887422 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="extract-content" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887635 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="registry-server" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887654 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29f4857-ff0d-4806-ba09-74448200e8e2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.889016 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.892034 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.892216 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.892782 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.892967 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.893050 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.913603 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q"] Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.939878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.940185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dshp7\" (UniqueName: \"kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.940355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.940551 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.940762 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.043382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.043918 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.044173 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dshp7\" (UniqueName: \"kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.044448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.044777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.048875 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: 
\"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.049838 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.050111 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.050426 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.065762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dshp7\" (UniqueName: \"kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.207861 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.798479 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q"] Mar 20 17:55:04 crc kubenswrapper[4795]: I0320 17:55:04.789645 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" event={"ID":"b6da9d2a-e18f-4994-b8f3-6b1eb969564b","Type":"ContainerStarted","Data":"916d45486bc7c9429b47d1621e1445553bb690d12dcaa2aceb7cbd80e6648c0c"} Mar 20 17:55:04 crc kubenswrapper[4795]: I0320 17:55:04.790286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" event={"ID":"b6da9d2a-e18f-4994-b8f3-6b1eb969564b","Type":"ContainerStarted","Data":"ce67ae3c6d3b823d115ffd8cc57bd8b2b930ee03e39a23bfb62f0f80486fdc98"} Mar 20 17:55:04 crc kubenswrapper[4795]: I0320 17:55:04.809321 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" podStartSLOduration=2.3938161989999998 podStartE2EDuration="2.809302062s" podCreationTimestamp="2026-03-20 17:55:02 +0000 UTC" firstStartedPulling="2026-03-20 17:55:03.80564339 +0000 UTC m=+2247.263674931" lastFinishedPulling="2026-03-20 17:55:04.221129253 +0000 UTC m=+2247.679160794" observedRunningTime="2026-03-20 17:55:04.804893875 +0000 UTC m=+2248.262925436" watchObservedRunningTime="2026-03-20 17:55:04.809302062 +0000 UTC m=+2248.267333613" Mar 20 17:55:54 crc kubenswrapper[4795]: I0320 17:55:54.459337 4795 scope.go:117] "RemoveContainer" containerID="ec6a69189563a780b942ae970e8e1801846953cabcf1239c190354a1203053b4" Mar 20 17:55:54 crc kubenswrapper[4795]: I0320 17:55:54.502746 4795 scope.go:117] "RemoveContainer" containerID="4e8b0e1259002a3662a0cddeb6fd2fae0a2ae00aa800f638b73b2262d55bedd2" Mar 20 17:55:54 crc kubenswrapper[4795]: 
I0320 17:55:54.588411 4795 scope.go:117] "RemoveContainer" containerID="88b194a74064309f622b8e25f76f210948d20e5936b41beb91453d2773fb7483" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.166241 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567156-cp2gz"] Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.169698 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.176764 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.177488 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.178558 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.198547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-cp2gz"] Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.200125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxd6\" (UniqueName: \"kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6\") pod \"auto-csr-approver-29567156-cp2gz\" (UID: \"23b9cffc-8f64-481b-9f51-334e3e04ed7b\") " pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.301926 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxd6\" (UniqueName: \"kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6\") pod \"auto-csr-approver-29567156-cp2gz\" (UID: 
\"23b9cffc-8f64-481b-9f51-334e3e04ed7b\") " pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.332854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwxd6\" (UniqueName: \"kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6\") pod \"auto-csr-approver-29567156-cp2gz\" (UID: \"23b9cffc-8f64-481b-9f51-334e3e04ed7b\") " pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.508565 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:01 crc kubenswrapper[4795]: I0320 17:56:01.003432 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-cp2gz"] Mar 20 17:56:01 crc kubenswrapper[4795]: I0320 17:56:01.467797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" event={"ID":"23b9cffc-8f64-481b-9f51-334e3e04ed7b","Type":"ContainerStarted","Data":"a30f48bfe88311e7f2682f264a4f3bc2e92038fc2943c09e9fcd3b4e26c94210"} Mar 20 17:56:03 crc kubenswrapper[4795]: I0320 17:56:03.492479 4795 generic.go:334] "Generic (PLEG): container finished" podID="23b9cffc-8f64-481b-9f51-334e3e04ed7b" containerID="025d16245a433259b825961c6fb9d8ed0412608aa4b43ab349fe67ca35e229a7" exitCode=0 Mar 20 17:56:03 crc kubenswrapper[4795]: I0320 17:56:03.492617 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" event={"ID":"23b9cffc-8f64-481b-9f51-334e3e04ed7b","Type":"ContainerDied","Data":"025d16245a433259b825961c6fb9d8ed0412608aa4b43ab349fe67ca35e229a7"} Mar 20 17:56:04 crc kubenswrapper[4795]: I0320 17:56:04.908112 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.004510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwxd6\" (UniqueName: \"kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6\") pod \"23b9cffc-8f64-481b-9f51-334e3e04ed7b\" (UID: \"23b9cffc-8f64-481b-9f51-334e3e04ed7b\") " Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.010649 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6" (OuterVolumeSpecName: "kube-api-access-jwxd6") pod "23b9cffc-8f64-481b-9f51-334e3e04ed7b" (UID: "23b9cffc-8f64-481b-9f51-334e3e04ed7b"). InnerVolumeSpecName "kube-api-access-jwxd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.106954 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwxd6\" (UniqueName: \"kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.515960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" event={"ID":"23b9cffc-8f64-481b-9f51-334e3e04ed7b","Type":"ContainerDied","Data":"a30f48bfe88311e7f2682f264a4f3bc2e92038fc2943c09e9fcd3b4e26c94210"} Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.516016 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a30f48bfe88311e7f2682f264a4f3bc2e92038fc2943c09e9fcd3b4e26c94210" Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.516037 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:06 crc kubenswrapper[4795]: I0320 17:56:06.002894 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-glzlx"] Mar 20 17:56:06 crc kubenswrapper[4795]: I0320 17:56:06.012846 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-glzlx"] Mar 20 17:56:07 crc kubenswrapper[4795]: I0320 17:56:07.269010 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e628e06-ee48-4969-824e-fba400b67d3a" path="/var/lib/kubelet/pods/5e628e06-ee48-4969-824e-fba400b67d3a/volumes" Mar 20 17:56:11 crc kubenswrapper[4795]: I0320 17:56:11.300135 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:56:11 crc kubenswrapper[4795]: I0320 17:56:11.300243 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.897473 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"] Mar 20 17:56:25 crc kubenswrapper[4795]: E0320 17:56:25.899272 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b9cffc-8f64-481b-9f51-334e3e04ed7b" containerName="oc" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.899310 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b9cffc-8f64-481b-9f51-334e3e04ed7b" containerName="oc" Mar 20 17:56:25 crc 
kubenswrapper[4795]: I0320 17:56:25.899792 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b9cffc-8f64-481b-9f51-334e3e04ed7b" containerName="oc" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.902888 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.945332 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"] Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.983297 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.983719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwskk\" (UniqueName: \"kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.983749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.085909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.086006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwskk\" (UniqueName: \"kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.086037 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.086534 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.086840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.112941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwskk\" (UniqueName: 
\"kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.239222 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.714923 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"] Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.770289 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerStarted","Data":"7671c9adfde4a05a6d9e863366f8c916de7cad7cd8a8af69c46501654cef5383"} Mar 20 17:56:27 crc kubenswrapper[4795]: I0320 17:56:27.782801 4795 generic.go:334] "Generic (PLEG): container finished" podID="98c1605e-8284-489e-83f0-bab45156e299" containerID="a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c" exitCode=0 Mar 20 17:56:27 crc kubenswrapper[4795]: I0320 17:56:27.782909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerDied","Data":"a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c"} Mar 20 17:56:28 crc kubenswrapper[4795]: I0320 17:56:28.793526 4795 generic.go:334] "Generic (PLEG): container finished" podID="98c1605e-8284-489e-83f0-bab45156e299" containerID="7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d" exitCode=0 Mar 20 17:56:28 crc kubenswrapper[4795]: I0320 17:56:28.794561 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" 
event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerDied","Data":"7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d"} Mar 20 17:56:29 crc kubenswrapper[4795]: I0320 17:56:29.806710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerStarted","Data":"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe"} Mar 20 17:56:29 crc kubenswrapper[4795]: I0320 17:56:29.836062 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-67sls" podStartSLOduration=3.4231254939999998 podStartE2EDuration="4.836038818s" podCreationTimestamp="2026-03-20 17:56:25 +0000 UTC" firstStartedPulling="2026-03-20 17:56:27.784909403 +0000 UTC m=+2331.242940974" lastFinishedPulling="2026-03-20 17:56:29.197822757 +0000 UTC m=+2332.655854298" observedRunningTime="2026-03-20 17:56:29.824816248 +0000 UTC m=+2333.282847789" watchObservedRunningTime="2026-03-20 17:56:29.836038818 +0000 UTC m=+2333.294070359" Mar 20 17:56:36 crc kubenswrapper[4795]: I0320 17:56:36.239813 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:36 crc kubenswrapper[4795]: I0320 17:56:36.240441 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:36 crc kubenswrapper[4795]: I0320 17:56:36.288460 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:36 crc kubenswrapper[4795]: I0320 17:56:36.962741 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.117160 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-67sls"] Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.117726 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-67sls" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="registry-server" containerID="cri-o://29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe" gracePeriod=2 Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.605653 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.714936 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities\") pod \"98c1605e-8284-489e-83f0-bab45156e299\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.715167 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content\") pod \"98c1605e-8284-489e-83f0-bab45156e299\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.715247 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwskk\" (UniqueName: \"kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk\") pod \"98c1605e-8284-489e-83f0-bab45156e299\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.715971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities" (OuterVolumeSpecName: "utilities") pod "98c1605e-8284-489e-83f0-bab45156e299" (UID: 
"98c1605e-8284-489e-83f0-bab45156e299"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.720970 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk" (OuterVolumeSpecName: "kube-api-access-jwskk") pod "98c1605e-8284-489e-83f0-bab45156e299" (UID: "98c1605e-8284-489e-83f0-bab45156e299"). InnerVolumeSpecName "kube-api-access-jwskk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.743674 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98c1605e-8284-489e-83f0-bab45156e299" (UID: "98c1605e-8284-489e-83f0-bab45156e299"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.817850 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.817887 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwskk\" (UniqueName: \"kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.817902 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.941282 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="98c1605e-8284-489e-83f0-bab45156e299" containerID="29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe" exitCode=0 Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.941337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerDied","Data":"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe"} Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.941373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerDied","Data":"7671c9adfde4a05a6d9e863366f8c916de7cad7cd8a8af69c46501654cef5383"} Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.941401 4795 scope.go:117] "RemoveContainer" containerID="29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.941340 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.986560 4795 scope.go:117] "RemoveContainer" containerID="7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d" Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.989331 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"] Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.008177 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"] Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.033024 4795 scope.go:117] "RemoveContainer" containerID="a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c" Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.072425 4795 scope.go:117] "RemoveContainer" containerID="29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe" Mar 20 17:56:41 crc kubenswrapper[4795]: E0320 17:56:41.073077 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe\": container with ID starting with 29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe not found: ID does not exist" containerID="29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe" Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.073138 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe"} err="failed to get container status \"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe\": rpc error: code = NotFound desc = could not find container \"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe\": container with ID starting with 29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe not found: 
ID does not exist" Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.073178 4795 scope.go:117] "RemoveContainer" containerID="7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d" Mar 20 17:56:41 crc kubenswrapper[4795]: E0320 17:56:41.073719 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d\": container with ID starting with 7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d not found: ID does not exist" containerID="7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d" Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.073751 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d"} err="failed to get container status \"7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d\": rpc error: code = NotFound desc = could not find container \"7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d\": container with ID starting with 7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d not found: ID does not exist" Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.073772 4795 scope.go:117] "RemoveContainer" containerID="a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c" Mar 20 17:56:41 crc kubenswrapper[4795]: E0320 17:56:41.074120 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c\": container with ID starting with a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c not found: ID does not exist" containerID="a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c" Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.074190 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c"} err="failed to get container status \"a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c\": rpc error: code = NotFound desc = could not find container \"a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c\": container with ID starting with a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c not found: ID does not exist" Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.273857 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c1605e-8284-489e-83f0-bab45156e299" path="/var/lib/kubelet/pods/98c1605e-8284-489e-83f0-bab45156e299/volumes" Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.300970 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.301046 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:56:54 crc kubenswrapper[4795]: I0320 17:56:54.679273 4795 scope.go:117] "RemoveContainer" containerID="b0e8f1ce702c9e1cfb11740285e904a1e8d1f711ef3e97850efbb6236da59523" Mar 20 17:57:11 crc kubenswrapper[4795]: I0320 17:57:11.300225 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 17:57:11 crc kubenswrapper[4795]: I0320 17:57:11.300944 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:57:11 crc kubenswrapper[4795]: I0320 17:57:11.301010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:57:11 crc kubenswrapper[4795]: I0320 17:57:11.302108 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:57:11 crc kubenswrapper[4795]: I0320 17:57:11.302203 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" gracePeriod=600 Mar 20 17:57:11 crc kubenswrapper[4795]: E0320 17:57:11.429140 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:57:12 crc kubenswrapper[4795]: 
I0320 17:57:12.321196 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" exitCode=0 Mar 20 17:57:12 crc kubenswrapper[4795]: I0320 17:57:12.321280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"} Mar 20 17:57:12 crc kubenswrapper[4795]: I0320 17:57:12.322461 4795 scope.go:117] "RemoveContainer" containerID="f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5" Mar 20 17:57:12 crc kubenswrapper[4795]: I0320 17:57:12.323296 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:57:12 crc kubenswrapper[4795]: E0320 17:57:12.323753 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:57:27 crc kubenswrapper[4795]: I0320 17:57:27.260588 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:57:27 crc kubenswrapper[4795]: E0320 17:57:27.261757 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:57:38 crc kubenswrapper[4795]: I0320 17:57:38.252629 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:57:38 crc kubenswrapper[4795]: E0320 17:57:38.253938 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:57:52 crc kubenswrapper[4795]: I0320 17:57:52.252335 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:57:52 crc kubenswrapper[4795]: E0320 17:57:52.253370 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.153353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567158-45rzl"] Mar 20 17:58:00 crc kubenswrapper[4795]: E0320 17:58:00.154312 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="extract-content" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.154331 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c1605e-8284-489e-83f0-bab45156e299" 
containerName="extract-content" Mar 20 17:58:00 crc kubenswrapper[4795]: E0320 17:58:00.154361 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="registry-server" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.154370 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="registry-server" Mar 20 17:58:00 crc kubenswrapper[4795]: E0320 17:58:00.154387 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="extract-utilities" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.154395 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="extract-utilities" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.154616 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="registry-server" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.155413 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-45rzl" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.158890 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.160154 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.161501 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.162464 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-45rzl"] Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.249394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5qxf\" (UniqueName: \"kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf\") pod \"auto-csr-approver-29567158-45rzl\" (UID: \"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed\") " pod="openshift-infra/auto-csr-approver-29567158-45rzl" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.350918 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5qxf\" (UniqueName: \"kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf\") pod \"auto-csr-approver-29567158-45rzl\" (UID: \"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed\") " pod="openshift-infra/auto-csr-approver-29567158-45rzl" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.381247 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5qxf\" (UniqueName: \"kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf\") pod \"auto-csr-approver-29567158-45rzl\" (UID: \"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed\") " 
pod="openshift-infra/auto-csr-approver-29567158-45rzl" Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.492749 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-45rzl" Mar 20 17:58:00 crc kubenswrapper[4795]: W0320 17:58:00.930454 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08fd2b90_3d66_4e64_bbb0_c4eaf75e0aed.slice/crio-2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32 WatchSource:0}: Error finding container 2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32: Status 404 returned error can't find the container with id 2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32 Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.934835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-45rzl"] Mar 20 17:58:01 crc kubenswrapper[4795]: I0320 17:58:01.846044 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-45rzl" event={"ID":"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed","Type":"ContainerStarted","Data":"2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32"} Mar 20 17:58:02 crc kubenswrapper[4795]: I0320 17:58:02.858966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-45rzl" event={"ID":"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed","Type":"ContainerStarted","Data":"62ed101e828326cc7fffe40dc572b4d86ec23a0ff89623436e14e45075fbfa9a"} Mar 20 17:58:02 crc kubenswrapper[4795]: I0320 17:58:02.881403 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567158-45rzl" podStartSLOduration=1.5707947199999999 podStartE2EDuration="2.881379475s" podCreationTimestamp="2026-03-20 17:58:00 +0000 UTC" firstStartedPulling="2026-03-20 17:58:00.934979433 +0000 UTC 
m=+2424.393011004" lastFinishedPulling="2026-03-20 17:58:02.245564208 +0000 UTC m=+2425.703595759" observedRunningTime="2026-03-20 17:58:02.873890842 +0000 UTC m=+2426.331922403" watchObservedRunningTime="2026-03-20 17:58:02.881379475 +0000 UTC m=+2426.339411046" Mar 20 17:58:03 crc kubenswrapper[4795]: I0320 17:58:03.871713 4795 generic.go:334] "Generic (PLEG): container finished" podID="08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" containerID="62ed101e828326cc7fffe40dc572b4d86ec23a0ff89623436e14e45075fbfa9a" exitCode=0 Mar 20 17:58:03 crc kubenswrapper[4795]: I0320 17:58:03.871796 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-45rzl" event={"ID":"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed","Type":"ContainerDied","Data":"62ed101e828326cc7fffe40dc572b4d86ec23a0ff89623436e14e45075fbfa9a"} Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.216791 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-45rzl" Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.353538 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5qxf\" (UniqueName: \"kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf\") pod \"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed\" (UID: \"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed\") " Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.359899 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf" (OuterVolumeSpecName: "kube-api-access-t5qxf") pod "08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" (UID: "08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed"). InnerVolumeSpecName "kube-api-access-t5qxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.456888 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5qxf\" (UniqueName: \"kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf\") on node \"crc\" DevicePath \"\"" Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.893156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-45rzl" event={"ID":"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed","Type":"ContainerDied","Data":"2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32"} Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.893210 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32" Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.893284 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-45rzl" Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.965594 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-ksnjt"] Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.975570 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-ksnjt"] Mar 20 17:58:06 crc kubenswrapper[4795]: I0320 17:58:06.252453 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:58:06 crc kubenswrapper[4795]: E0320 17:58:06.252942 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:58:07 crc kubenswrapper[4795]: I0320 17:58:07.264498 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8abf4de-a372-47df-b14c-490f1e084a56" path="/var/lib/kubelet/pods/d8abf4de-a372-47df-b14c-490f1e084a56/volumes" Mar 20 17:58:20 crc kubenswrapper[4795]: I0320 17:58:20.252754 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:58:20 crc kubenswrapper[4795]: E0320 17:58:20.253841 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:58:35 crc kubenswrapper[4795]: I0320 17:58:35.251847 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:58:35 crc kubenswrapper[4795]: E0320 17:58:35.252439 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:58:50 crc kubenswrapper[4795]: I0320 17:58:50.252852 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:58:50 crc kubenswrapper[4795]: E0320 17:58:50.253670 4795 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:58:54 crc kubenswrapper[4795]: I0320 17:58:54.842918 4795 scope.go:117] "RemoveContainer" containerID="fa7c9e74af14d50a1c364d101636ba64da237edd40eebd00160c638dba974672" Mar 20 17:59:05 crc kubenswrapper[4795]: I0320 17:59:05.253161 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:59:05 crc kubenswrapper[4795]: E0320 17:59:05.254283 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:59:16 crc kubenswrapper[4795]: I0320 17:59:16.252759 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:59:16 crc kubenswrapper[4795]: E0320 17:59:16.253898 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:59:20 crc kubenswrapper[4795]: I0320 17:59:20.842316 4795 generic.go:334] "Generic 
(PLEG): container finished" podID="b6da9d2a-e18f-4994-b8f3-6b1eb969564b" containerID="916d45486bc7c9429b47d1621e1445553bb690d12dcaa2aceb7cbd80e6648c0c" exitCode=0 Mar 20 17:59:20 crc kubenswrapper[4795]: I0320 17:59:20.842667 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" event={"ID":"b6da9d2a-e18f-4994-b8f3-6b1eb969564b","Type":"ContainerDied","Data":"916d45486bc7c9429b47d1621e1445553bb690d12dcaa2aceb7cbd80e6648c0c"} Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.361124 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.427354 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle\") pod \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.427496 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dshp7\" (UniqueName: \"kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7\") pod \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.427532 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam\") pod \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.427644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory\") pod \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.427760 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0\") pod \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.433205 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7" (OuterVolumeSpecName: "kube-api-access-dshp7") pod "b6da9d2a-e18f-4994-b8f3-6b1eb969564b" (UID: "b6da9d2a-e18f-4994-b8f3-6b1eb969564b"). InnerVolumeSpecName "kube-api-access-dshp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.441878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b6da9d2a-e18f-4994-b8f3-6b1eb969564b" (UID: "b6da9d2a-e18f-4994-b8f3-6b1eb969564b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.462220 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b6da9d2a-e18f-4994-b8f3-6b1eb969564b" (UID: "b6da9d2a-e18f-4994-b8f3-6b1eb969564b"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.463235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory" (OuterVolumeSpecName: "inventory") pod "b6da9d2a-e18f-4994-b8f3-6b1eb969564b" (UID: "b6da9d2a-e18f-4994-b8f3-6b1eb969564b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.468893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b6da9d2a-e18f-4994-b8f3-6b1eb969564b" (UID: "b6da9d2a-e18f-4994-b8f3-6b1eb969564b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.529388 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.529420 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.529431 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.529439 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dshp7\" (UniqueName: 
\"kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.529449 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.863610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" event={"ID":"b6da9d2a-e18f-4994-b8f3-6b1eb969564b","Type":"ContainerDied","Data":"ce67ae3c6d3b823d115ffd8cc57bd8b2b930ee03e39a23bfb62f0f80486fdc98"} Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.864041 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce67ae3c6d3b823d115ffd8cc57bd8b2b930ee03e39a23bfb62f0f80486fdc98" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.863728 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.962415 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx"] Mar 20 17:59:22 crc kubenswrapper[4795]: E0320 17:59:22.962808 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" containerName="oc" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.962822 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" containerName="oc" Mar 20 17:59:22 crc kubenswrapper[4795]: E0320 17:59:22.962838 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6da9d2a-e18f-4994-b8f3-6b1eb969564b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.962845 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6da9d2a-e18f-4994-b8f3-6b1eb969564b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.963009 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" containerName="oc" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.963030 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6da9d2a-e18f-4994-b8f3-6b1eb969564b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.964026 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.966115 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.966115 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.966733 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.969391 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.969533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.969625 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.969937 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.994308 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx"] Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.036995 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: 
I0320 17:59:23.037044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037166 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037208 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037253 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nl9n\" (UniqueName: \"kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: 
\"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139167 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139209 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nl9n\" (UniqueName: \"kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139343 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139370 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: 
\"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139501 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139527 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139570 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.140516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.143840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.143863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.144749 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.145047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.145654 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.145909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.147376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.149171 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.150380 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.167581 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nl9n\" (UniqueName: 
\"kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.287230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.917752 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx"] Mar 20 17:59:24 crc kubenswrapper[4795]: I0320 17:59:24.882173 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" event={"ID":"709f5080-c511-4d3b-bc9c-baeec85fa245","Type":"ContainerStarted","Data":"e18c048a597dc4d9d215e2e77b0cde124882e9a2ee7a96faf5c2c8ceff8b067d"} Mar 20 17:59:25 crc kubenswrapper[4795]: I0320 17:59:25.892083 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" event={"ID":"709f5080-c511-4d3b-bc9c-baeec85fa245","Type":"ContainerStarted","Data":"0c4a86a4e0bd9983c0776f522d0ac2f129ecc0ae67cd8e75fe2c36c5fe922436"} Mar 20 17:59:25 crc kubenswrapper[4795]: I0320 17:59:25.925717 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" podStartSLOduration=3.139207976 podStartE2EDuration="3.925669226s" podCreationTimestamp="2026-03-20 17:59:22 +0000 UTC" firstStartedPulling="2026-03-20 17:59:23.919337796 +0000 UTC m=+2507.377369347" lastFinishedPulling="2026-03-20 17:59:24.705799016 +0000 UTC m=+2508.163830597" observedRunningTime="2026-03-20 17:59:25.91681053 +0000 UTC m=+2509.374842081" watchObservedRunningTime="2026-03-20 17:59:25.925669226 +0000 UTC m=+2509.383700787" Mar 20 17:59:27 crc kubenswrapper[4795]: I0320 
17:59:27.267636 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:59:27 crc kubenswrapper[4795]: E0320 17:59:27.268143 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:59:42 crc kubenswrapper[4795]: I0320 17:59:42.252851 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:59:42 crc kubenswrapper[4795]: E0320 17:59:42.253822 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:59:55 crc kubenswrapper[4795]: I0320 17:59:55.253181 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:59:55 crc kubenswrapper[4795]: E0320 17:59:55.254168 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:00:00 crc 
kubenswrapper[4795]: I0320 18:00:00.174041 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff"] Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.176186 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.180468 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.180898 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.182821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsz95\" (UniqueName: \"kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.182903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.183161 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: 
\"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.192799 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff"] Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.263195 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567160-k8xrk"] Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.264754 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.267711 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.267807 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.269349 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.286276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk24v\" (UniqueName: \"kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v\") pod \"auto-csr-approver-29567160-k8xrk\" (UID: \"a01c5ccc-0cea-415d-969d-64f17a21036b\") " pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.286845 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.286976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsz95\" (UniqueName: \"kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.287040 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.289137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.293835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-k8xrk"] Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.305683 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.310481 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsz95\" (UniqueName: \"kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.389171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk24v\" (UniqueName: \"kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v\") pod \"auto-csr-approver-29567160-k8xrk\" (UID: \"a01c5ccc-0cea-415d-969d-64f17a21036b\") " pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.413949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk24v\" (UniqueName: \"kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v\") pod \"auto-csr-approver-29567160-k8xrk\" (UID: \"a01c5ccc-0cea-415d-969d-64f17a21036b\") " pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.502126 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.588799 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:01 crc kubenswrapper[4795]: W0320 18:00:01.009534 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06aca85b_9cb4_47ae_ad12_b1cc429c542d.slice/crio-25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2 WatchSource:0}: Error finding container 25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2: Status 404 returned error can't find the container with id 25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2 Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.013367 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff"] Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.101399 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-k8xrk"] Mar 20 18:00:01 crc kubenswrapper[4795]: W0320 18:00:01.101936 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda01c5ccc_0cea_415d_969d_64f17a21036b.slice/crio-c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a WatchSource:0}: Error finding container c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a: Status 404 returned error can't find the container with id c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.104698 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.294007 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" 
event={"ID":"06aca85b-9cb4-47ae-ad12-b1cc429c542d","Type":"ContainerStarted","Data":"1dda362040903edd45488eb5dfa4174252f2f44818cf7249d6ad4da4aa90fe4e"} Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.294342 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" event={"ID":"06aca85b-9cb4-47ae-ad12-b1cc429c542d","Type":"ContainerStarted","Data":"25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2"} Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.295566 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" event={"ID":"a01c5ccc-0cea-415d-969d-64f17a21036b","Type":"ContainerStarted","Data":"c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a"} Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.330551 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" podStartSLOduration=1.330530772 podStartE2EDuration="1.330530772s" podCreationTimestamp="2026-03-20 18:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:00:01.322037568 +0000 UTC m=+2544.780069109" watchObservedRunningTime="2026-03-20 18:00:01.330530772 +0000 UTC m=+2544.788562313" Mar 20 18:00:02 crc kubenswrapper[4795]: I0320 18:00:02.306779 4795 generic.go:334] "Generic (PLEG): container finished" podID="06aca85b-9cb4-47ae-ad12-b1cc429c542d" containerID="1dda362040903edd45488eb5dfa4174252f2f44818cf7249d6ad4da4aa90fe4e" exitCode=0 Mar 20 18:00:02 crc kubenswrapper[4795]: I0320 18:00:02.306841 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" 
event={"ID":"06aca85b-9cb4-47ae-ad12-b1cc429c542d","Type":"ContainerDied","Data":"1dda362040903edd45488eb5dfa4174252f2f44818cf7249d6ad4da4aa90fe4e"} Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.656999 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.851132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume\") pod \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.851234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsz95\" (UniqueName: \"kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95\") pod \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.851374 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume\") pod \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.852312 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume" (OuterVolumeSpecName: "config-volume") pod "06aca85b-9cb4-47ae-ad12-b1cc429c542d" (UID: "06aca85b-9cb4-47ae-ad12-b1cc429c542d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.856898 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06aca85b-9cb4-47ae-ad12-b1cc429c542d" (UID: "06aca85b-9cb4-47ae-ad12-b1cc429c542d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.861919 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95" (OuterVolumeSpecName: "kube-api-access-zsz95") pod "06aca85b-9cb4-47ae-ad12-b1cc429c542d" (UID: "06aca85b-9cb4-47ae-ad12-b1cc429c542d"). InnerVolumeSpecName "kube-api-access-zsz95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.953752 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.954115 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsz95\" (UniqueName: \"kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.954187 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:04 crc kubenswrapper[4795]: I0320 18:00:04.333808 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" 
event={"ID":"06aca85b-9cb4-47ae-ad12-b1cc429c542d","Type":"ContainerDied","Data":"25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2"} Mar 20 18:00:04 crc kubenswrapper[4795]: I0320 18:00:04.333878 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2" Mar 20 18:00:04 crc kubenswrapper[4795]: I0320 18:00:04.333881 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:04 crc kubenswrapper[4795]: I0320 18:00:04.418387 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"] Mar 20 18:00:04 crc kubenswrapper[4795]: I0320 18:00:04.427877 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"] Mar 20 18:00:05 crc kubenswrapper[4795]: I0320 18:00:05.267147 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="918aa57e-8c94-4427-b6bd-218a5687d684" path="/var/lib/kubelet/pods/918aa57e-8c94-4427-b6bd-218a5687d684/volumes" Mar 20 18:00:08 crc kubenswrapper[4795]: I0320 18:00:08.253189 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:00:08 crc kubenswrapper[4795]: E0320 18:00:08.253800 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:00:19 crc kubenswrapper[4795]: I0320 18:00:19.253278 4795 scope.go:117] "RemoveContainer" 
containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:00:19 crc kubenswrapper[4795]: E0320 18:00:19.254128 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:00:26 crc kubenswrapper[4795]: I0320 18:00:26.565762 4795 generic.go:334] "Generic (PLEG): container finished" podID="a01c5ccc-0cea-415d-969d-64f17a21036b" containerID="476381efe19b63607c82b31ec27f9dd8365ccb9f0dd4f0e61aba026eba68e6a2" exitCode=0 Mar 20 18:00:26 crc kubenswrapper[4795]: I0320 18:00:26.565872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" event={"ID":"a01c5ccc-0cea-415d-969d-64f17a21036b","Type":"ContainerDied","Data":"476381efe19b63607c82b31ec27f9dd8365ccb9f0dd4f0e61aba026eba68e6a2"} Mar 20 18:00:27 crc kubenswrapper[4795]: I0320 18:00:27.931903 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.001139 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk24v\" (UniqueName: \"kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v\") pod \"a01c5ccc-0cea-415d-969d-64f17a21036b\" (UID: \"a01c5ccc-0cea-415d-969d-64f17a21036b\") " Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.006411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v" (OuterVolumeSpecName: "kube-api-access-zk24v") pod "a01c5ccc-0cea-415d-969d-64f17a21036b" (UID: "a01c5ccc-0cea-415d-969d-64f17a21036b"). InnerVolumeSpecName "kube-api-access-zk24v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.102884 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk24v\" (UniqueName: \"kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.590331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" event={"ID":"a01c5ccc-0cea-415d-969d-64f17a21036b","Type":"ContainerDied","Data":"c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a"} Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.590378 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a" Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.590383 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:29 crc kubenswrapper[4795]: I0320 18:00:29.002605 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-4z2rq"] Mar 20 18:00:29 crc kubenswrapper[4795]: I0320 18:00:29.009613 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-4z2rq"] Mar 20 18:00:29 crc kubenswrapper[4795]: I0320 18:00:29.267886 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61821949-5c88-4f4c-adab-b93269540a03" path="/var/lib/kubelet/pods/61821949-5c88-4f4c-adab-b93269540a03/volumes" Mar 20 18:00:32 crc kubenswrapper[4795]: I0320 18:00:32.252237 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:00:32 crc kubenswrapper[4795]: E0320 18:00:32.252941 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:00:47 crc kubenswrapper[4795]: I0320 18:00:47.257856 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:00:47 crc kubenswrapper[4795]: E0320 18:00:47.258519 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:00:54 crc kubenswrapper[4795]: I0320 18:00:54.955627 4795 scope.go:117] "RemoveContainer" containerID="6f68cab9e191fff6af7e246da5293fa0fd1c14c356566586bb75900bd179fcf6" Mar 20 18:00:54 crc kubenswrapper[4795]: I0320 18:00:54.997246 4795 scope.go:117] "RemoveContainer" containerID="1a29e74f6dc8f40ef08045f483f837253c35a577aa6f85ce5cd8c2a56afebf9c" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.180873 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567161-t26vc"] Mar 20 18:01:00 crc kubenswrapper[4795]: E0320 18:01:00.183155 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06aca85b-9cb4-47ae-ad12-b1cc429c542d" containerName="collect-profiles" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.183198 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="06aca85b-9cb4-47ae-ad12-b1cc429c542d" containerName="collect-profiles" Mar 20 18:01:00 crc kubenswrapper[4795]: E0320 18:01:00.183214 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01c5ccc-0cea-415d-969d-64f17a21036b" containerName="oc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.183223 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01c5ccc-0cea-415d-969d-64f17a21036b" containerName="oc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.183745 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="06aca85b-9cb4-47ae-ad12-b1cc429c542d" containerName="collect-profiles" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.183949 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01c5ccc-0cea-415d-969d-64f17a21036b" containerName="oc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.187586 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.189820 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567161-t26vc"] Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.252158 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:01:00 crc kubenswrapper[4795]: E0320 18:01:00.252511 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.264207 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.264306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.264452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys\") pod 
\"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.264498 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4qsp\" (UniqueName: \"kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.366444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4qsp\" (UniqueName: \"kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.366561 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.366713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.366793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys\") pod 
\"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.372331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.375816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.378810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.387812 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4qsp\" (UniqueName: \"kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.527462 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.990610 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567161-t26vc"] Mar 20 18:01:01 crc kubenswrapper[4795]: I0320 18:01:01.880553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-t26vc" event={"ID":"cdfe5ffc-ab15-4277-966f-f506e725e8b1","Type":"ContainerStarted","Data":"91a2f5603edcd1f18f4d99385ba309c78145d20156d550a342b1889c19be8173"} Mar 20 18:01:01 crc kubenswrapper[4795]: I0320 18:01:01.880838 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-t26vc" event={"ID":"cdfe5ffc-ab15-4277-966f-f506e725e8b1","Type":"ContainerStarted","Data":"dbf97ccf4d81b44dc6b8e01bef5228dd6f8628066161ee1e7b1585a28ab22358"} Mar 20 18:01:01 crc kubenswrapper[4795]: I0320 18:01:01.919085 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567161-t26vc" podStartSLOduration=1.9190604470000001 podStartE2EDuration="1.919060447s" podCreationTimestamp="2026-03-20 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:01:01.914260787 +0000 UTC m=+2605.372292338" watchObservedRunningTime="2026-03-20 18:01:01.919060447 +0000 UTC m=+2605.377091988" Mar 20 18:01:03 crc kubenswrapper[4795]: I0320 18:01:03.903402 4795 generic.go:334] "Generic (PLEG): container finished" podID="cdfe5ffc-ab15-4277-966f-f506e725e8b1" containerID="91a2f5603edcd1f18f4d99385ba309c78145d20156d550a342b1889c19be8173" exitCode=0 Mar 20 18:01:03 crc kubenswrapper[4795]: I0320 18:01:03.903837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-t26vc" 
event={"ID":"cdfe5ffc-ab15-4277-966f-f506e725e8b1","Type":"ContainerDied","Data":"91a2f5603edcd1f18f4d99385ba309c78145d20156d550a342b1889c19be8173"} Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.294839 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.375341 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys\") pod \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.375539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4qsp\" (UniqueName: \"kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp\") pod \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.375710 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data\") pod \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.375744 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle\") pod \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.382878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp" 
(OuterVolumeSpecName: "kube-api-access-m4qsp") pod "cdfe5ffc-ab15-4277-966f-f506e725e8b1" (UID: "cdfe5ffc-ab15-4277-966f-f506e725e8b1"). InnerVolumeSpecName "kube-api-access-m4qsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.392946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cdfe5ffc-ab15-4277-966f-f506e725e8b1" (UID: "cdfe5ffc-ab15-4277-966f-f506e725e8b1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.428620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdfe5ffc-ab15-4277-966f-f506e725e8b1" (UID: "cdfe5ffc-ab15-4277-966f-f506e725e8b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.448106 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data" (OuterVolumeSpecName: "config-data") pod "cdfe5ffc-ab15-4277-966f-f506e725e8b1" (UID: "cdfe5ffc-ab15-4277-966f-f506e725e8b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.478439 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4qsp\" (UniqueName: \"kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.478820 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.478835 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.478847 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.926214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-t26vc" event={"ID":"cdfe5ffc-ab15-4277-966f-f506e725e8b1","Type":"ContainerDied","Data":"dbf97ccf4d81b44dc6b8e01bef5228dd6f8628066161ee1e7b1585a28ab22358"} Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.926248 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf97ccf4d81b44dc6b8e01bef5228dd6f8628066161ee1e7b1585a28ab22358" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.926290 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:11 crc kubenswrapper[4795]: I0320 18:01:11.253023 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:01:11 crc kubenswrapper[4795]: E0320 18:01:11.253616 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:01:22 crc kubenswrapper[4795]: I0320 18:01:22.251987 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:01:22 crc kubenswrapper[4795]: E0320 18:01:22.252522 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:01:37 crc kubenswrapper[4795]: I0320 18:01:37.260766 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:01:37 crc kubenswrapper[4795]: E0320 18:01:37.261982 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:01:49 crc kubenswrapper[4795]: I0320 18:01:49.252293 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:01:49 crc kubenswrapper[4795]: E0320 18:01:49.253300 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.163367 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567162-729s4"] Mar 20 18:02:00 crc kubenswrapper[4795]: E0320 18:02:00.164307 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfe5ffc-ab15-4277-966f-f506e725e8b1" containerName="keystone-cron" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.164320 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfe5ffc-ab15-4277-966f-f506e725e8b1" containerName="keystone-cron" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.164531 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfe5ffc-ab15-4277-966f-f506e725e8b1" containerName="keystone-cron" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.165303 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-729s4" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.174657 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.175012 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.176110 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.176882 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-729s4"] Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.327536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkt5f\" (UniqueName: \"kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f\") pod \"auto-csr-approver-29567162-729s4\" (UID: \"3f28413f-4baf-4c13-bfaa-dc76fcb80e65\") " pod="openshift-infra/auto-csr-approver-29567162-729s4" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.430492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkt5f\" (UniqueName: \"kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f\") pod \"auto-csr-approver-29567162-729s4\" (UID: \"3f28413f-4baf-4c13-bfaa-dc76fcb80e65\") " pod="openshift-infra/auto-csr-approver-29567162-729s4" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.461842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkt5f\" (UniqueName: \"kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f\") pod \"auto-csr-approver-29567162-729s4\" (UID: \"3f28413f-4baf-4c13-bfaa-dc76fcb80e65\") " 
pod="openshift-infra/auto-csr-approver-29567162-729s4" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.488248 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-729s4" Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.985830 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-729s4"] Mar 20 18:02:01 crc kubenswrapper[4795]: I0320 18:02:01.433974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-729s4" event={"ID":"3f28413f-4baf-4c13-bfaa-dc76fcb80e65","Type":"ContainerStarted","Data":"5e90451ce58885bdbe5b6c79c389be731f2b6a7327ef8c4d734e1350341b943b"} Mar 20 18:02:02 crc kubenswrapper[4795]: I0320 18:02:02.445221 4795 generic.go:334] "Generic (PLEG): container finished" podID="3f28413f-4baf-4c13-bfaa-dc76fcb80e65" containerID="d974ca7e16de9404ab80c37c963c39d73239baae8fb9a3246b8e5e345f171158" exitCode=0 Mar 20 18:02:02 crc kubenswrapper[4795]: I0320 18:02:02.445321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-729s4" event={"ID":"3f28413f-4baf-4c13-bfaa-dc76fcb80e65","Type":"ContainerDied","Data":"d974ca7e16de9404ab80c37c963c39d73239baae8fb9a3246b8e5e345f171158"} Mar 20 18:02:03 crc kubenswrapper[4795]: I0320 18:02:03.252466 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:02:03 crc kubenswrapper[4795]: E0320 18:02:03.252944 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" 
Mar 20 18:02:03 crc kubenswrapper[4795]: I0320 18:02:03.776616 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-729s4" Mar 20 18:02:03 crc kubenswrapper[4795]: I0320 18:02:03.891929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkt5f\" (UniqueName: \"kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f\") pod \"3f28413f-4baf-4c13-bfaa-dc76fcb80e65\" (UID: \"3f28413f-4baf-4c13-bfaa-dc76fcb80e65\") " Mar 20 18:02:03 crc kubenswrapper[4795]: I0320 18:02:03.899180 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f" (OuterVolumeSpecName: "kube-api-access-dkt5f") pod "3f28413f-4baf-4c13-bfaa-dc76fcb80e65" (UID: "3f28413f-4baf-4c13-bfaa-dc76fcb80e65"). InnerVolumeSpecName "kube-api-access-dkt5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:02:03 crc kubenswrapper[4795]: I0320 18:02:03.994761 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkt5f\" (UniqueName: \"kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:04 crc kubenswrapper[4795]: I0320 18:02:04.462574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-729s4" event={"ID":"3f28413f-4baf-4c13-bfaa-dc76fcb80e65","Type":"ContainerDied","Data":"5e90451ce58885bdbe5b6c79c389be731f2b6a7327ef8c4d734e1350341b943b"} Mar 20 18:02:04 crc kubenswrapper[4795]: I0320 18:02:04.462622 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e90451ce58885bdbe5b6c79c389be731f2b6a7327ef8c4d734e1350341b943b" Mar 20 18:02:04 crc kubenswrapper[4795]: I0320 18:02:04.462676 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-729s4" Mar 20 18:02:04 crc kubenswrapper[4795]: I0320 18:02:04.840607 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-cp2gz"] Mar 20 18:02:04 crc kubenswrapper[4795]: I0320 18:02:04.849928 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-cp2gz"] Mar 20 18:02:05 crc kubenswrapper[4795]: I0320 18:02:05.271359 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b9cffc-8f64-481b-9f51-334e3e04ed7b" path="/var/lib/kubelet/pods/23b9cffc-8f64-481b-9f51-334e3e04ed7b/volumes" Mar 20 18:02:11 crc kubenswrapper[4795]: I0320 18:02:11.526487 4795 generic.go:334] "Generic (PLEG): container finished" podID="709f5080-c511-4d3b-bc9c-baeec85fa245" containerID="0c4a86a4e0bd9983c0776f522d0ac2f129ecc0ae67cd8e75fe2c36c5fe922436" exitCode=0 Mar 20 18:02:11 crc kubenswrapper[4795]: I0320 18:02:11.526582 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" event={"ID":"709f5080-c511-4d3b-bc9c-baeec85fa245","Type":"ContainerDied","Data":"0c4a86a4e0bd9983c0776f522d0ac2f129ecc0ae67cd8e75fe2c36c5fe922436"} Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.074185 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181652 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nl9n\" (UniqueName: \"kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181811 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181890 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.182084 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.182217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.182276 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.193039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.204609 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n" (OuterVolumeSpecName: "kube-api-access-7nl9n") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "kube-api-access-7nl9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.217170 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.217174 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.217818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.224346 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.224586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.230303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory" (OuterVolumeSpecName: "inventory") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.234237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.238589 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.243001 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.287494 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.287801 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.287822 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.287840 4795 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.287857 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288000 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nl9n\" (UniqueName: \"kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288072 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory\") on node 
\"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288130 4795 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288308 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288337 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288358 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.550017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" event={"ID":"709f5080-c511-4d3b-bc9c-baeec85fa245","Type":"ContainerDied","Data":"e18c048a597dc4d9d215e2e77b0cde124882e9a2ee7a96faf5c2c8ceff8b067d"} Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.550071 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e18c048a597dc4d9d215e2e77b0cde124882e9a2ee7a96faf5c2c8ceff8b067d" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.550126 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.816244 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"] Mar 20 18:02:13 crc kubenswrapper[4795]: E0320 18:02:13.831840 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709f5080-c511-4d3b-bc9c-baeec85fa245" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.832057 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="709f5080-c511-4d3b-bc9c-baeec85fa245" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 18:02:13 crc kubenswrapper[4795]: E0320 18:02:13.832131 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f28413f-4baf-4c13-bfaa-dc76fcb80e65" containerName="oc" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.832194 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f28413f-4baf-4c13-bfaa-dc76fcb80e65" containerName="oc" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.832445 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f28413f-4baf-4c13-bfaa-dc76fcb80e65" containerName="oc" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.832535 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="709f5080-c511-4d3b-bc9c-baeec85fa245" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.833205 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.835477 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.836082 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.836097 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.836249 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.836329 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.844156 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"] Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.905247 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.906859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: 
\"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.906997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.907117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndcs8\" (UniqueName: \"kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.909849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.909953 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.910066 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.010712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.010974 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndcs8\" (UniqueName: \"kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.011081 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.011180 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.011302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.011391 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.011476 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.017270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.019183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.019639 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.020243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.020641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.033132 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.033939 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndcs8\" (UniqueName: \"kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.165576 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.685149 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"] Mar 20 18:02:15 crc kubenswrapper[4795]: I0320 18:02:15.272295 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:02:15 crc kubenswrapper[4795]: I0320 18:02:15.577159 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" event={"ID":"d519d04c-89f1-46b7-8136-1a9596af73ac","Type":"ContainerStarted","Data":"01bbbaa61b45180cf8109cf1e889d03b1b5c7df633fdfee4f8bb470cfabf15d5"} Mar 20 18:02:16 crc kubenswrapper[4795]: I0320 18:02:16.590274 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb"} Mar 20 18:02:16 crc kubenswrapper[4795]: I0320 18:02:16.592320 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" event={"ID":"d519d04c-89f1-46b7-8136-1a9596af73ac","Type":"ContainerStarted","Data":"82decdc91bfadfc70aba4b8aca3b29297f575d87a05741abcadb41a86f6b2aa7"} Mar 20 18:02:55 crc kubenswrapper[4795]: I0320 18:02:55.103740 4795 scope.go:117] "RemoveContainer" containerID="025d16245a433259b825961c6fb9d8ed0412608aa4b43ab349fe67ca35e229a7" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.152471 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" podStartSLOduration=106.418763605 podStartE2EDuration="1m47.152452441s" podCreationTimestamp="2026-03-20 18:02:13 +0000 UTC" 
firstStartedPulling="2026-03-20 18:02:14.684504769 +0000 UTC m=+2678.142536330" lastFinishedPulling="2026-03-20 18:02:15.418193615 +0000 UTC m=+2678.876225166" observedRunningTime="2026-03-20 18:02:16.645792465 +0000 UTC m=+2680.103824016" watchObservedRunningTime="2026-03-20 18:04:00.152452441 +0000 UTC m=+2783.610483982" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.155758 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567164-j2kpr"] Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.157113 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.159872 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.160170 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.161376 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.189957 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-j2kpr"] Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.215841 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whjk5\" (UniqueName: \"kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5\") pod \"auto-csr-approver-29567164-j2kpr\" (UID: \"42c32f39-7999-4aa6-be6f-bdfc11c61cf8\") " pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.317930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whjk5\" 
(UniqueName: \"kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5\") pod \"auto-csr-approver-29567164-j2kpr\" (UID: \"42c32f39-7999-4aa6-be6f-bdfc11c61cf8\") " pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.339264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whjk5\" (UniqueName: \"kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5\") pod \"auto-csr-approver-29567164-j2kpr\" (UID: \"42c32f39-7999-4aa6-be6f-bdfc11c61cf8\") " pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.486424 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.961511 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-j2kpr"] Mar 20 18:04:01 crc kubenswrapper[4795]: I0320 18:04:01.716445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" event={"ID":"42c32f39-7999-4aa6-be6f-bdfc11c61cf8","Type":"ContainerStarted","Data":"e1ed5eb4e516f4be735785fac8f3f2e20c5ee1c9acfa50fb7f7a2c53e1b37596"} Mar 20 18:04:02 crc kubenswrapper[4795]: I0320 18:04:02.730154 4795 generic.go:334] "Generic (PLEG): container finished" podID="42c32f39-7999-4aa6-be6f-bdfc11c61cf8" containerID="b5132a3bc3cc2a9d3f7f72ee052f22ef62d6486d39d7ff63bd2d4ca8c43eb377" exitCode=0 Mar 20 18:04:02 crc kubenswrapper[4795]: I0320 18:04:02.730356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" event={"ID":"42c32f39-7999-4aa6-be6f-bdfc11c61cf8","Type":"ContainerDied","Data":"b5132a3bc3cc2a9d3f7f72ee052f22ef62d6486d39d7ff63bd2d4ca8c43eb377"} Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.125964 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.192561 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whjk5\" (UniqueName: \"kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5\") pod \"42c32f39-7999-4aa6-be6f-bdfc11c61cf8\" (UID: \"42c32f39-7999-4aa6-be6f-bdfc11c61cf8\") " Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.199974 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5" (OuterVolumeSpecName: "kube-api-access-whjk5") pod "42c32f39-7999-4aa6-be6f-bdfc11c61cf8" (UID: "42c32f39-7999-4aa6-be6f-bdfc11c61cf8"). InnerVolumeSpecName "kube-api-access-whjk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.294160 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whjk5\" (UniqueName: \"kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5\") on node \"crc\" DevicePath \"\"" Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.755603 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" event={"ID":"42c32f39-7999-4aa6-be6f-bdfc11c61cf8","Type":"ContainerDied","Data":"e1ed5eb4e516f4be735785fac8f3f2e20c5ee1c9acfa50fb7f7a2c53e1b37596"} Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.755639 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ed5eb4e516f4be735785fac8f3f2e20c5ee1c9acfa50fb7f7a2c53e1b37596" Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.755798 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:05 crc kubenswrapper[4795]: I0320 18:04:05.230808 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-45rzl"] Mar 20 18:04:05 crc kubenswrapper[4795]: I0320 18:04:05.239160 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-45rzl"] Mar 20 18:04:05 crc kubenswrapper[4795]: I0320 18:04:05.261014 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" path="/var/lib/kubelet/pods/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed/volumes" Mar 20 18:04:41 crc kubenswrapper[4795]: I0320 18:04:41.300581 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:04:41 crc kubenswrapper[4795]: I0320 18:04:41.302382 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:04:55 crc kubenswrapper[4795]: I0320 18:04:55.227729 4795 scope.go:117] "RemoveContainer" containerID="62ed101e828326cc7fffe40dc572b4d86ec23a0ff89623436e14e45075fbfa9a" Mar 20 18:05:11 crc kubenswrapper[4795]: I0320 18:05:11.300042 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:05:11 crc kubenswrapper[4795]: 
I0320 18:05:11.300902 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:05:36 crc kubenswrapper[4795]: I0320 18:05:36.711026 4795 generic.go:334] "Generic (PLEG): container finished" podID="d519d04c-89f1-46b7-8136-1a9596af73ac" containerID="82decdc91bfadfc70aba4b8aca3b29297f575d87a05741abcadb41a86f6b2aa7" exitCode=0 Mar 20 18:05:36 crc kubenswrapper[4795]: I0320 18:05:36.711113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" event={"ID":"d519d04c-89f1-46b7-8136-1a9596af73ac","Type":"ContainerDied","Data":"82decdc91bfadfc70aba4b8aca3b29297f575d87a05741abcadb41a86f6b2aa7"} Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.195383 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324160 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324269 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324314 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndcs8\" (UniqueName: \"kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324735 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 
18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324776 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.332051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.332932 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8" (OuterVolumeSpecName: "kube-api-access-ndcs8") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "kube-api-access-ndcs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.357649 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.366999 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory" (OuterVolumeSpecName: "inventory") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.369948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.375110 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.382579 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428193 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndcs8\" (UniqueName: \"kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428419 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428462 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428476 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428490 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc 
kubenswrapper[4795]: I0320 18:05:38.428504 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428517 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.736342 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" event={"ID":"d519d04c-89f1-46b7-8136-1a9596af73ac","Type":"ContainerDied","Data":"01bbbaa61b45180cf8109cf1e889d03b1b5c7df633fdfee4f8bb470cfabf15d5"} Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.736747 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01bbbaa61b45180cf8109cf1e889d03b1b5c7df633fdfee4f8bb470cfabf15d5" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.736428 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.299815 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.302525 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.302774 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.304183 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.304474 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb" gracePeriod=600 Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.767429 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb" exitCode=0 Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.767506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb"} Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.767832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"} Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.767859 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.030670 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:00 crc kubenswrapper[4795]: E0320 18:06:00.039306 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c32f39-7999-4aa6-be6f-bdfc11c61cf8" containerName="oc" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.039479 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c32f39-7999-4aa6-be6f-bdfc11c61cf8" containerName="oc" Mar 20 18:06:00 crc kubenswrapper[4795]: E0320 18:06:00.039584 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d519d04c-89f1-46b7-8136-1a9596af73ac" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.039723 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d519d04c-89f1-46b7-8136-1a9596af73ac" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 
18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.040043 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c32f39-7999-4aa6-be6f-bdfc11c61cf8" containerName="oc" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.040173 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d519d04c-89f1-46b7-8136-1a9596af73ac" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.041867 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.054553 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.164586 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567166-n9tld"] Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.166527 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.170369 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.171122 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.171303 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.174509 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-n9tld"] Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.209727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbfnj\" (UniqueName: \"kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.209784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.209878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " 
pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.311387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.311786 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgbt2\" (UniqueName: \"kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2\") pod \"auto-csr-approver-29567166-n9tld\" (UID: \"26125bad-5b31-4c3d-901b-758cb842af78\") " pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.311931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbfnj\" (UniqueName: \"kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.312010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.312087 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content\") pod \"certified-operators-cgvrw\" (UID: 
\"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.312396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.332285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbfnj\" (UniqueName: \"kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.386764 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.413792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgbt2\" (UniqueName: \"kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2\") pod \"auto-csr-approver-29567166-n9tld\" (UID: \"26125bad-5b31-4c3d-901b-758cb842af78\") " pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.438962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgbt2\" (UniqueName: \"kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2\") pod \"auto-csr-approver-29567166-n9tld\" (UID: \"26125bad-5b31-4c3d-901b-758cb842af78\") " pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.495058 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.876383 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:00 crc kubenswrapper[4795]: W0320 18:06:00.879234 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod709ba7cf_f8f7_4741_a050_f10234db1ff3.slice/crio-c8d8fc1c89571c56bda6be901cac9b09bc4690f68e10c0adc1669ccfe50eb5ec WatchSource:0}: Error finding container c8d8fc1c89571c56bda6be901cac9b09bc4690f68e10c0adc1669ccfe50eb5ec: Status 404 returned error can't find the container with id c8d8fc1c89571c56bda6be901cac9b09bc4690f68e10c0adc1669ccfe50eb5ec Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.988061 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-n9tld"] Mar 20 18:06:00 crc kubenswrapper[4795]: W0320 18:06:00.990280 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26125bad_5b31_4c3d_901b_758cb842af78.slice/crio-89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083 WatchSource:0}: Error finding container 89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083: Status 404 returned error can't find the container with id 89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083 Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.991854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerStarted","Data":"c8d8fc1c89571c56bda6be901cac9b09bc4690f68e10c0adc1669ccfe50eb5ec"} Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.992838 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 
20 18:06:02 crc kubenswrapper[4795]: I0320 18:06:02.005487 4795 generic.go:334] "Generic (PLEG): container finished" podID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerID="f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d" exitCode=0 Mar 20 18:06:02 crc kubenswrapper[4795]: I0320 18:06:02.005552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerDied","Data":"f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d"} Mar 20 18:06:02 crc kubenswrapper[4795]: I0320 18:06:02.008241 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-n9tld" event={"ID":"26125bad-5b31-4c3d-901b-758cb842af78","Type":"ContainerStarted","Data":"89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083"} Mar 20 18:06:03 crc kubenswrapper[4795]: I0320 18:06:03.019216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerStarted","Data":"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a"} Mar 20 18:06:03 crc kubenswrapper[4795]: I0320 18:06:03.021705 4795 generic.go:334] "Generic (PLEG): container finished" podID="26125bad-5b31-4c3d-901b-758cb842af78" containerID="388b030bf3690b9a4b1c0fb962fdb64c5a18c01bd7022e209c55231177d61b95" exitCode=0 Mar 20 18:06:03 crc kubenswrapper[4795]: I0320 18:06:03.021798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-n9tld" event={"ID":"26125bad-5b31-4c3d-901b-758cb842af78","Type":"ContainerDied","Data":"388b030bf3690b9a4b1c0fb962fdb64c5a18c01bd7022e209c55231177d61b95"} Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.033799 4795 generic.go:334] "Generic (PLEG): container finished" podID="709ba7cf-f8f7-4741-a050-f10234db1ff3" 
containerID="8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a" exitCode=0 Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.034036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerDied","Data":"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a"} Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.431965 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.593957 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgbt2\" (UniqueName: \"kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2\") pod \"26125bad-5b31-4c3d-901b-758cb842af78\" (UID: \"26125bad-5b31-4c3d-901b-758cb842af78\") " Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.600758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2" (OuterVolumeSpecName: "kube-api-access-bgbt2") pod "26125bad-5b31-4c3d-901b-758cb842af78" (UID: "26125bad-5b31-4c3d-901b-758cb842af78"). InnerVolumeSpecName "kube-api-access-bgbt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.696521 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgbt2\" (UniqueName: \"kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.047750 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.051235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-n9tld" event={"ID":"26125bad-5b31-4c3d-901b-758cb842af78","Type":"ContainerDied","Data":"89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083"} Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.051427 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083" Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.055389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerStarted","Data":"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379"} Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.084891 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cgvrw" podStartSLOduration=3.667673141 podStartE2EDuration="6.084870271s" podCreationTimestamp="2026-03-20 18:05:59 +0000 UTC" firstStartedPulling="2026-03-20 18:06:02.008994219 +0000 UTC m=+2905.467025770" lastFinishedPulling="2026-03-20 18:06:04.426191359 +0000 UTC m=+2907.884222900" observedRunningTime="2026-03-20 18:06:05.078137452 +0000 UTC m=+2908.536169043" watchObservedRunningTime="2026-03-20 18:06:05.084870271 +0000 UTC m=+2908.542901812" Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.507413 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-k8xrk"] Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.514124 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-k8xrk"] Mar 20 18:06:07 crc kubenswrapper[4795]: I0320 18:06:07.282271 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01c5ccc-0cea-415d-969d-64f17a21036b" path="/var/lib/kubelet/pods/a01c5ccc-0cea-415d-969d-64f17a21036b/volumes" Mar 20 18:06:10 crc kubenswrapper[4795]: I0320 18:06:10.386967 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:10 crc kubenswrapper[4795]: I0320 18:06:10.387477 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:10 crc kubenswrapper[4795]: I0320 18:06:10.475124 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:11 crc kubenswrapper[4795]: I0320 18:06:11.189355 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:11 crc kubenswrapper[4795]: I0320 18:06:11.274798 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.145300 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cgvrw" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="registry-server" containerID="cri-o://ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379" gracePeriod=2 Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.706741 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.802164 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content\") pod \"709ba7cf-f8f7-4741-a050-f10234db1ff3\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.802285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbfnj\" (UniqueName: \"kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj\") pod \"709ba7cf-f8f7-4741-a050-f10234db1ff3\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.802413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities\") pod \"709ba7cf-f8f7-4741-a050-f10234db1ff3\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.803544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities" (OuterVolumeSpecName: "utilities") pod "709ba7cf-f8f7-4741-a050-f10234db1ff3" (UID: "709ba7cf-f8f7-4741-a050-f10234db1ff3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.809852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj" (OuterVolumeSpecName: "kube-api-access-xbfnj") pod "709ba7cf-f8f7-4741-a050-f10234db1ff3" (UID: "709ba7cf-f8f7-4741-a050-f10234db1ff3"). InnerVolumeSpecName "kube-api-access-xbfnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.851225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "709ba7cf-f8f7-4741-a050-f10234db1ff3" (UID: "709ba7cf-f8f7-4741-a050-f10234db1ff3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.904643 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.904681 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.904701 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbfnj\" (UniqueName: \"kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.159024 4795 generic.go:334] "Generic (PLEG): container finished" podID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerID="ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379" exitCode=0 Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.159086 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerDied","Data":"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379"} Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.159421 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerDied","Data":"c8d8fc1c89571c56bda6be901cac9b09bc4690f68e10c0adc1669ccfe50eb5ec"} Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.159446 4795 scope.go:117] "RemoveContainer" containerID="ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.159169 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.200310 4795 scope.go:117] "RemoveContainer" containerID="8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.203028 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.214462 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.235394 4795 scope.go:117] "RemoveContainer" containerID="f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.263819 4795 scope.go:117] "RemoveContainer" containerID="ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379" Mar 20 18:06:14 crc kubenswrapper[4795]: E0320 18:06:14.264297 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379\": container with ID starting with ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379 not found: ID does not exist" containerID="ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 
18:06:14.264340 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379"} err="failed to get container status \"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379\": rpc error: code = NotFound desc = could not find container \"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379\": container with ID starting with ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379 not found: ID does not exist" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.264366 4795 scope.go:117] "RemoveContainer" containerID="8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a" Mar 20 18:06:14 crc kubenswrapper[4795]: E0320 18:06:14.264882 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a\": container with ID starting with 8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a not found: ID does not exist" containerID="8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.264917 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a"} err="failed to get container status \"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a\": rpc error: code = NotFound desc = could not find container \"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a\": container with ID starting with 8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a not found: ID does not exist" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.264943 4795 scope.go:117] "RemoveContainer" containerID="f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d" Mar 20 18:06:14 crc 
kubenswrapper[4795]: E0320 18:06:14.265530 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d\": container with ID starting with f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d not found: ID does not exist" containerID="f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.265564 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d"} err="failed to get container status \"f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d\": rpc error: code = NotFound desc = could not find container \"f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d\": container with ID starting with f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d not found: ID does not exist" Mar 20 18:06:15 crc kubenswrapper[4795]: I0320 18:06:15.264322 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" path="/var/lib/kubelet/pods/709ba7cf-f8f7-4741-a050-f10234db1ff3/volumes" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.139217 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:06:35 crc kubenswrapper[4795]: E0320 18:06:35.140383 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="extract-content" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140404 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="extract-content" Mar 20 18:06:35 crc kubenswrapper[4795]: E0320 18:06:35.140440 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="registry-server" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140453 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="registry-server" Mar 20 18:06:35 crc kubenswrapper[4795]: E0320 18:06:35.140468 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="extract-utilities" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140481 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="extract-utilities" Mar 20 18:06:35 crc kubenswrapper[4795]: E0320 18:06:35.140530 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26125bad-5b31-4c3d-901b-758cb842af78" containerName="oc" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140543 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="26125bad-5b31-4c3d-901b-758cb842af78" containerName="oc" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140910 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="registry-server" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140936 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="26125bad-5b31-4c3d-901b-758cb842af78" containerName="oc" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.141900 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.146009 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.147151 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.148583 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-zgwjr" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.148982 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.167748 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.233899 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.233957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.233981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret\") 
pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234138 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88sw\" (UniqueName: \"kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234183 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234368 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234882 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337219 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337407 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs\") pod \"tempest-tests-tempest\" (UID: 
\"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337493 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88sw\" (UniqueName: \"kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337513 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " 
pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.338045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.338105 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.338562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.339105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.339189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc 
kubenswrapper[4795]: I0320 18:06:35.345481 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.345601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.348003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.360034 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88sw\" (UniqueName: \"kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.369246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.472028 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.938016 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:06:36 crc kubenswrapper[4795]: I0320 18:06:36.406115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"caaf60a5-8c45-4831-8d26-8cf808f1da7a","Type":"ContainerStarted","Data":"7bdfe7f881d951a74ca8b66b0f91841dff449bc239ef1c2b7c679ee61596377d"} Mar 20 18:06:55 crc kubenswrapper[4795]: I0320 18:06:55.365637 4795 scope.go:117] "RemoveContainer" containerID="476381efe19b63607c82b31ec27f9dd8365ccb9f0dd4f0e61aba026eba68e6a2" Mar 20 18:07:10 crc kubenswrapper[4795]: E0320 18:07:10.532121 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 20 18:07:10 crc kubenswrapper[4795]: E0320 18:07:10.532597 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w88sw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(caaf60a5-8c45-4831-8d26-8cf808f1da7a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 18:07:10 crc kubenswrapper[4795]: E0320 18:07:10.533735 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="caaf60a5-8c45-4831-8d26-8cf808f1da7a" Mar 20 18:07:10 crc kubenswrapper[4795]: E0320 18:07:10.799530 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="caaf60a5-8c45-4831-8d26-8cf808f1da7a" Mar 20 18:07:12 crc 
kubenswrapper[4795]: I0320 18:07:12.996251 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"] Mar 20 18:07:12 crc kubenswrapper[4795]: I0320 18:07:12.999753 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.007520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"] Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.117489 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.117555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.118030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hszrb\" (UniqueName: \"kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.219929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.220128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hszrb\" (UniqueName: \"kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.220191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.220653 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.220852 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.254723 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hszrb\" (UniqueName: 
\"kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.324419 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.784114 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"] Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.830999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerStarted","Data":"49fa4e3d814862c4a24af506dc4bf3906d72f9f6d200a77b7c6698cc84d8440d"} Mar 20 18:07:14 crc kubenswrapper[4795]: I0320 18:07:14.841672 4795 generic.go:334] "Generic (PLEG): container finished" podID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerID="62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d" exitCode=0 Mar 20 18:07:14 crc kubenswrapper[4795]: I0320 18:07:14.841810 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerDied","Data":"62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d"} Mar 20 18:07:15 crc kubenswrapper[4795]: I0320 18:07:15.871771 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerStarted","Data":"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0"} Mar 20 18:07:16 crc kubenswrapper[4795]: I0320 18:07:16.883082 4795 generic.go:334] "Generic (PLEG): container finished" podID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" 
containerID="ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0" exitCode=0 Mar 20 18:07:16 crc kubenswrapper[4795]: I0320 18:07:16.883134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerDied","Data":"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0"} Mar 20 18:07:17 crc kubenswrapper[4795]: I0320 18:07:17.896549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerStarted","Data":"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c"} Mar 20 18:07:17 crc kubenswrapper[4795]: I0320 18:07:17.919525 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dbgpt" podStartSLOduration=3.505611374 podStartE2EDuration="5.919505261s" podCreationTimestamp="2026-03-20 18:07:12 +0000 UTC" firstStartedPulling="2026-03-20 18:07:14.843548617 +0000 UTC m=+2978.301580158" lastFinishedPulling="2026-03-20 18:07:17.257442484 +0000 UTC m=+2980.715474045" observedRunningTime="2026-03-20 18:07:17.915860398 +0000 UTC m=+2981.373891949" watchObservedRunningTime="2026-03-20 18:07:17.919505261 +0000 UTC m=+2981.377536822" Mar 20 18:07:22 crc kubenswrapper[4795]: I0320 18:07:22.730085 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 18:07:23 crc kubenswrapper[4795]: I0320 18:07:23.325233 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:23 crc kubenswrapper[4795]: I0320 18:07:23.325581 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:23 crc kubenswrapper[4795]: I0320 18:07:23.374296 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:23 crc kubenswrapper[4795]: I0320 18:07:23.955540 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"caaf60a5-8c45-4831-8d26-8cf808f1da7a","Type":"ContainerStarted","Data":"ca1e86805a9f6b3f6807f721075e3f792e3f51254780ac13719c7eec007f4373"} Mar 20 18:07:23 crc kubenswrapper[4795]: I0320 18:07:23.987837 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.195969788 podStartE2EDuration="49.987814711s" podCreationTimestamp="2026-03-20 18:06:34 +0000 UTC" firstStartedPulling="2026-03-20 18:06:35.934071526 +0000 UTC m=+2939.392103067" lastFinishedPulling="2026-03-20 18:07:22.725916439 +0000 UTC m=+2986.183947990" observedRunningTime="2026-03-20 18:07:23.978262814 +0000 UTC m=+2987.436294365" watchObservedRunningTime="2026-03-20 18:07:23.987814711 +0000 UTC m=+2987.445846262" Mar 20 18:07:24 crc kubenswrapper[4795]: I0320 18:07:24.033520 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:24 crc kubenswrapper[4795]: I0320 18:07:24.093506 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"] Mar 20 18:07:25 crc kubenswrapper[4795]: I0320 18:07:25.988639 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dbgpt" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="registry-server" containerID="cri-o://623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c" gracePeriod=2 Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.456353 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.607206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities\") pod \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.607346 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content\") pod \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.607393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hszrb\" (UniqueName: \"kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb\") pod \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.608795 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities" (OuterVolumeSpecName: "utilities") pod "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" (UID: "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.614236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb" (OuterVolumeSpecName: "kube-api-access-hszrb") pod "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" (UID: "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7"). InnerVolumeSpecName "kube-api-access-hszrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.709682 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hszrb\" (UniqueName: \"kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb\") on node \"crc\" DevicePath \"\"" Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.709736 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.753597 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" (UID: "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.811824 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.009657 4795 generic.go:334] "Generic (PLEG): container finished" podID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerID="623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c" exitCode=0 Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.009743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerDied","Data":"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c"} Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.010227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerDied","Data":"49fa4e3d814862c4a24af506dc4bf3906d72f9f6d200a77b7c6698cc84d8440d"} Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.009778 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbgpt" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.010272 4795 scope.go:117] "RemoveContainer" containerID="623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.042203 4795 scope.go:117] "RemoveContainer" containerID="ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.062436 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"] Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.074188 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"] Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.086370 4795 scope.go:117] "RemoveContainer" containerID="62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.125671 4795 scope.go:117] "RemoveContainer" containerID="623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c" Mar 20 18:07:27 crc kubenswrapper[4795]: E0320 18:07:27.126753 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c\": container with ID starting with 623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c not found: ID does not exist" containerID="623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.126793 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c"} err="failed to get container status \"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c\": rpc error: code = NotFound desc = could not find container \"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c\": container with ID starting with 623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c not found: ID does not exist" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.126816 4795 scope.go:117] "RemoveContainer" containerID="ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0" Mar 20 18:07:27 crc kubenswrapper[4795]: E0320 18:07:27.127588 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0\": container with ID starting with ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0 not found: ID does not exist" containerID="ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.127616 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0"} err="failed to get container status \"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0\": rpc error: code = NotFound desc = could not find container \"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0\": container with ID starting with ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0 not found: ID does not exist" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.127636 4795 scope.go:117] "RemoveContainer" containerID="62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d" Mar 20 18:07:27 crc kubenswrapper[4795]: E0320 
18:07:27.128037 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d\": container with ID starting with 62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d not found: ID does not exist" containerID="62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.128061 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d"} err="failed to get container status \"62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d\": rpc error: code = NotFound desc = could not find container \"62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d\": container with ID starting with 62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d not found: ID does not exist" Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.263985 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" path="/var/lib/kubelet/pods/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7/volumes" Mar 20 18:07:41 crc kubenswrapper[4795]: I0320 18:07:41.299854 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:07:41 crc kubenswrapper[4795]: I0320 18:07:41.300582 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.146417 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567168-m7dd2"]
Mar 20 18:08:00 crc kubenswrapper[4795]: E0320 18:08:00.147185 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="extract-content"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.147198 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="extract-content"
Mar 20 18:08:00 crc kubenswrapper[4795]: E0320 18:08:00.147207 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="extract-utilities"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.147213 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="extract-utilities"
Mar 20 18:08:00 crc kubenswrapper[4795]: E0320 18:08:00.147228 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="registry-server"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.147234 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="registry-server"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.147412 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="registry-server"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.147966 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.155200 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.155314 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.156129 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.167461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-m7dd2"]
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.226827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxm6h\" (UniqueName: \"kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h\") pod \"auto-csr-approver-29567168-m7dd2\" (UID: \"47914a4d-df4b-443d-b7f4-b30bfe9e7a98\") " pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.328731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxm6h\" (UniqueName: \"kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h\") pod \"auto-csr-approver-29567168-m7dd2\" (UID: \"47914a4d-df4b-443d-b7f4-b30bfe9e7a98\") " pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.352272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxm6h\" (UniqueName: \"kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h\") pod \"auto-csr-approver-29567168-m7dd2\" (UID: \"47914a4d-df4b-443d-b7f4-b30bfe9e7a98\") " pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.483638 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:01 crc kubenswrapper[4795]: I0320 18:08:01.030066 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-m7dd2"]
Mar 20 18:08:01 crc kubenswrapper[4795]: I0320 18:08:01.386856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-m7dd2" event={"ID":"47914a4d-df4b-443d-b7f4-b30bfe9e7a98","Type":"ContainerStarted","Data":"5a4331e63748a9f9682d25e28b01e564fd5479e87b7c2fd8660f5fc91650c4f0"}
Mar 20 18:08:02 crc kubenswrapper[4795]: I0320 18:08:02.399053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-m7dd2" event={"ID":"47914a4d-df4b-443d-b7f4-b30bfe9e7a98","Type":"ContainerStarted","Data":"bf867a6e7d6c5ea2dcb1d75b1399a5354375f53c845fe9a30c34fd725a0eba5f"}
Mar 20 18:08:02 crc kubenswrapper[4795]: I0320 18:08:02.418381 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567168-m7dd2" podStartSLOduration=1.518303001 podStartE2EDuration="2.418366195s" podCreationTimestamp="2026-03-20 18:08:00 +0000 UTC" firstStartedPulling="2026-03-20 18:08:01.025501274 +0000 UTC m=+3024.483532845" lastFinishedPulling="2026-03-20 18:08:01.925564488 +0000 UTC m=+3025.383596039" observedRunningTime="2026-03-20 18:08:02.414850446 +0000 UTC m=+3025.872881987" watchObservedRunningTime="2026-03-20 18:08:02.418366195 +0000 UTC m=+3025.876397726"
Mar 20 18:08:03 crc kubenswrapper[4795]: I0320 18:08:03.410345 4795 generic.go:334] "Generic (PLEG): container finished" podID="47914a4d-df4b-443d-b7f4-b30bfe9e7a98" containerID="bf867a6e7d6c5ea2dcb1d75b1399a5354375f53c845fe9a30c34fd725a0eba5f" exitCode=0
Mar 20 18:08:03 crc kubenswrapper[4795]: I0320 18:08:03.410545 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-m7dd2" event={"ID":"47914a4d-df4b-443d-b7f4-b30bfe9e7a98","Type":"ContainerDied","Data":"bf867a6e7d6c5ea2dcb1d75b1399a5354375f53c845fe9a30c34fd725a0eba5f"}
Mar 20 18:08:04 crc kubenswrapper[4795]: I0320 18:08:04.893967 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.021574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxm6h\" (UniqueName: \"kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h\") pod \"47914a4d-df4b-443d-b7f4-b30bfe9e7a98\" (UID: \"47914a4d-df4b-443d-b7f4-b30bfe9e7a98\") "
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.027787 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h" (OuterVolumeSpecName: "kube-api-access-qxm6h") pod "47914a4d-df4b-443d-b7f4-b30bfe9e7a98" (UID: "47914a4d-df4b-443d-b7f4-b30bfe9e7a98"). InnerVolumeSpecName "kube-api-access-qxm6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.123508 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxm6h\" (UniqueName: \"kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h\") on node \"crc\" DevicePath \"\""
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.429984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-m7dd2" event={"ID":"47914a4d-df4b-443d-b7f4-b30bfe9e7a98","Type":"ContainerDied","Data":"5a4331e63748a9f9682d25e28b01e564fd5479e87b7c2fd8660f5fc91650c4f0"}
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.430020 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a4331e63748a9f9682d25e28b01e564fd5479e87b7c2fd8660f5fc91650c4f0"
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.430028 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.488565 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-729s4"]
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.494998 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-729s4"]
Mar 20 18:08:07 crc kubenswrapper[4795]: I0320 18:08:07.263041 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f28413f-4baf-4c13-bfaa-dc76fcb80e65" path="/var/lib/kubelet/pods/3f28413f-4baf-4c13-bfaa-dc76fcb80e65/volumes"
Mar 20 18:08:10 crc kubenswrapper[4795]: I0320 18:08:10.501893 4795 scope.go:117] "RemoveContainer" containerID="d974ca7e16de9404ab80c37c963c39d73239baae8fb9a3246b8e5e345f171158"
Mar 20 18:08:11 crc kubenswrapper[4795]: I0320 18:08:11.300042 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:08:11 crc kubenswrapper[4795]: I0320 18:08:11.300980 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.300315 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.301040 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.301103 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.301919 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.301984 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" gracePeriod=600
Mar 20 18:08:41 crc kubenswrapper[4795]: E0320 18:08:41.445349 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.797002 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" exitCode=0
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.797054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"}
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.797093 4795 scope.go:117] "RemoveContainer" containerID="4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.797631 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:08:41 crc kubenswrapper[4795]: E0320 18:08:41.798037 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:08:56 crc kubenswrapper[4795]: I0320 18:08:56.252797 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:08:56 crc kubenswrapper[4795]: E0320 18:08:56.254101 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:09:08 crc kubenswrapper[4795]: I0320 18:09:08.252659 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:09:08 crc kubenswrapper[4795]: E0320 18:09:08.253930 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:09:20 crc kubenswrapper[4795]: I0320 18:09:20.251924 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:09:20 crc kubenswrapper[4795]: E0320 18:09:20.252943 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:09:33 crc kubenswrapper[4795]: I0320 18:09:33.251930 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:09:33 crc kubenswrapper[4795]: E0320 18:09:33.253871 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:09:47 crc kubenswrapper[4795]: I0320 18:09:47.264540 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:09:47 crc kubenswrapper[4795]: E0320 18:09:47.265327 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:09:49 crc kubenswrapper[4795]: I0320 18:09:49.949416 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-njbls"]
Mar 20 18:09:49 crc kubenswrapper[4795]: E0320 18:09:49.950524 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47914a4d-df4b-443d-b7f4-b30bfe9e7a98" containerName="oc"
Mar 20 18:09:49 crc kubenswrapper[4795]: I0320 18:09:49.950552 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="47914a4d-df4b-443d-b7f4-b30bfe9e7a98" containerName="oc"
Mar 20 18:09:49 crc kubenswrapper[4795]: I0320 18:09:49.951060 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="47914a4d-df4b-443d-b7f4-b30bfe9e7a98" containerName="oc"
Mar 20 18:09:49 crc kubenswrapper[4795]: I0320 18:09:49.953864 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:49 crc kubenswrapper[4795]: I0320 18:09:49.960192 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njbls"]
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.032416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.032490 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6hms\" (UniqueName: \"kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.032658 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.133904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.134221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6hms\" (UniqueName: \"kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.134385 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.134946 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.134970 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.162864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6hms\" (UniqueName: \"kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.290210 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.904100 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njbls"]
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.499237 4795 generic.go:334] "Generic (PLEG): container finished" podID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerID="9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c" exitCode=0
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.499280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerDied","Data":"9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c"}
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.499316 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerStarted","Data":"d3d3bfd1cfcecc4d9898b8b0e44fe867b312105b71e72d952b6aa6c41b752b2a"}
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.736281 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"]
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.738998 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.753547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"]
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.892316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q55qg\" (UniqueName: \"kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.892375 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.892528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.994092 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q55qg\" (UniqueName: \"kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.994155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.994230 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.995234 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.995517 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:52 crc kubenswrapper[4795]: I0320 18:09:52.020625 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q55qg\" (UniqueName: \"kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:52 crc kubenswrapper[4795]: I0320 18:09:52.080238 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:09:52 crc kubenswrapper[4795]: I0320 18:09:52.630386 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"]
Mar 20 18:09:52 crc kubenswrapper[4795]: W0320 18:09:52.641847 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd2f5e2_4399_4d44_9266_a6221ff1548c.slice/crio-8ff5c9d939ea79173148d650042c34b46a99524d892c547cc67c59b58f443670 WatchSource:0}: Error finding container 8ff5c9d939ea79173148d650042c34b46a99524d892c547cc67c59b58f443670: Status 404 returned error can't find the container with id 8ff5c9d939ea79173148d650042c34b46a99524d892c547cc67c59b58f443670
Mar 20 18:09:53 crc kubenswrapper[4795]: I0320 18:09:53.516982 4795 generic.go:334] "Generic (PLEG): container finished" podID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerID="277d3cb4b0e2d6af98c3f5ab3c45418763eb9ab970b847eeb8b262c597a914dd" exitCode=0
Mar 20 18:09:53 crc kubenswrapper[4795]: I0320 18:09:53.517058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerDied","Data":"277d3cb4b0e2d6af98c3f5ab3c45418763eb9ab970b847eeb8b262c597a914dd"}
Mar 20 18:09:53 crc kubenswrapper[4795]: I0320 18:09:53.517381 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerStarted","Data":"8ff5c9d939ea79173148d650042c34b46a99524d892c547cc67c59b58f443670"}
Mar 20 18:09:53 crc kubenswrapper[4795]: I0320 18:09:53.519288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerStarted","Data":"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594"}
Mar 20 18:09:55 crc kubenswrapper[4795]: I0320 18:09:55.544831 4795 generic.go:334] "Generic (PLEG): container finished" podID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerID="7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594" exitCode=0
Mar 20 18:09:55 crc kubenswrapper[4795]: I0320 18:09:55.544983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerDied","Data":"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594"}
Mar 20 18:09:55 crc kubenswrapper[4795]: I0320 18:09:55.550598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerStarted","Data":"0321d4359ce799a59310c35a21c76695821cab3f794b5aecb8e10422c41bff1f"}
Mar 20 18:09:56 crc kubenswrapper[4795]: I0320 18:09:56.566169 4795 generic.go:334] "Generic (PLEG): container finished" podID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerID="0321d4359ce799a59310c35a21c76695821cab3f794b5aecb8e10422c41bff1f" exitCode=0
Mar 20 18:09:56 crc kubenswrapper[4795]: I0320 18:09:56.566295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerDied","Data":"0321d4359ce799a59310c35a21c76695821cab3f794b5aecb8e10422c41bff1f"}
Mar 20 18:09:57 crc kubenswrapper[4795]: I0320 18:09:57.578961 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerStarted","Data":"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238"}
Mar 20 18:09:57 crc kubenswrapper[4795]: I0320 18:09:57.581662 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerStarted","Data":"18c4e7fb6f51342d6b359aa691f6b4161ae6d5ef881b7ac6fe20572c49c8552d"}
Mar 20 18:09:57 crc kubenswrapper[4795]: I0320 18:09:57.607260 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-njbls" podStartSLOduration=3.766932521 podStartE2EDuration="8.607241585s" podCreationTimestamp="2026-03-20 18:09:49 +0000 UTC" firstStartedPulling="2026-03-20 18:09:51.501355025 +0000 UTC m=+3134.959386566" lastFinishedPulling="2026-03-20 18:09:56.341664089 +0000 UTC m=+3139.799695630" observedRunningTime="2026-03-20 18:09:57.598490313 +0000 UTC m=+3141.056521864" watchObservedRunningTime="2026-03-20 18:09:57.607241585 +0000 UTC m=+3141.065273136"
Mar 20 18:09:57 crc kubenswrapper[4795]: I0320 18:09:57.625539 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ldmj6" podStartSLOduration=3.07476179 podStartE2EDuration="6.625520502s" podCreationTimestamp="2026-03-20 18:09:51 +0000 UTC" firstStartedPulling="2026-03-20 18:09:53.518826851 +0000 UTC m=+3136.976858392" lastFinishedPulling="2026-03-20 18:09:57.069585563 +0000 UTC m=+3140.527617104" observedRunningTime="2026-03-20 18:09:57.623205421 +0000 UTC m=+3141.081236972" watchObservedRunningTime="2026-03-20 18:09:57.625520502 +0000 UTC m=+3141.083552053"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.154713 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567170-nh84v"]
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.156370 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-nh84v"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.179821 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.180129 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.180368 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.189889 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-nh84v"]
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.273609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28n6b\" (UniqueName: \"kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b\") pod \"auto-csr-approver-29567170-nh84v\" (UID: \"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4\") " pod="openshift-infra/auto-csr-approver-29567170-nh84v"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.291293 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.291339 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.376183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28n6b\" (UniqueName: \"kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b\") pod \"auto-csr-approver-29567170-nh84v\" (UID: \"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4\") " pod="openshift-infra/auto-csr-approver-29567170-nh84v"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.415090 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28n6b\" (UniqueName: \"kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b\") pod \"auto-csr-approver-29567170-nh84v\" (UID: \"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4\") " pod="openshift-infra/auto-csr-approver-29567170-nh84v"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.488910 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-nh84v"
Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.988958 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-nh84v"]
Mar 20 18:10:00 crc kubenswrapper[4795]: W0320 18:10:00.989337 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode37c7446_ee1e_4fba_b0ff_4b0002aa14b4.slice/crio-95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f WatchSource:0}: Error finding container 95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f: Status 404 returned error can't find the container with id 95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f
Mar 20 18:10:01 crc kubenswrapper[4795]: I0320 18:10:01.345766 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-njbls" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="registry-server" probeResult="failure" output=<
Mar 20 18:10:01 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Mar 20 18:10:01 crc kubenswrapper[4795]: >
Mar 20 18:10:01 crc kubenswrapper[4795]: I0320 18:10:01.654207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-nh84v" event={"ID":"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4","Type":"ContainerStarted","Data":"95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f"}
Mar 20 18:10:02 crc kubenswrapper[4795]: I0320 18:10:02.081202 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:10:02 crc kubenswrapper[4795]: I0320 18:10:02.081249 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:10:02 crc kubenswrapper[4795]: I0320 18:10:02.251939 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:10:02 crc kubenswrapper[4795]: E0320 18:10:02.252179 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:10:02 crc kubenswrapper[4795]: I0320 18:10:02.665522 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-nh84v" event={"ID":"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4","Type":"ContainerStarted","Data":"678d64f2bd651427dd881bafa2432c3aa8854cce5009833f7ed8c8ba9e0a700c"}
Mar 20 18:10:02 crc kubenswrapper[4795]: I0320 18:10:02.681566 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567170-nh84v" podStartSLOduration=1.492567885 podStartE2EDuration="2.681546479s" podCreationTimestamp="2026-03-20 18:10:00 +0000 UTC" firstStartedPulling="2026-03-20 18:10:00.992358848 +0000 UTC m=+3144.450390389" lastFinishedPulling="2026-03-20 18:10:02.181337442 +0000 UTC m=+3145.639368983" observedRunningTime="2026-03-20 18:10:02.678514146 +0000 UTC m=+3146.136545697" watchObservedRunningTime="2026-03-20 18:10:02.681546479 +0000 UTC m=+3146.139578020"
Mar 20 18:10:03 crc kubenswrapper[4795]: I0320 18:10:03.151553 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ldmj6" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="registry-server" probeResult="failure" output=<
Mar 20 18:10:03 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Mar 20 18:10:03 crc kubenswrapper[4795]: >
Mar 20 18:10:03 crc kubenswrapper[4795]: I0320 18:10:03.676304 4795 generic.go:334] "Generic (PLEG): container finished" podID="e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" containerID="678d64f2bd651427dd881bafa2432c3aa8854cce5009833f7ed8c8ba9e0a700c" exitCode=0
Mar 20 18:10:03 crc kubenswrapper[4795]: I0320 18:10:03.676352 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-nh84v" event={"ID":"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4","Type":"ContainerDied","Data":"678d64f2bd651427dd881bafa2432c3aa8854cce5009833f7ed8c8ba9e0a700c"}
Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.253254 4795 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-nh84v" Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.371899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28n6b\" (UniqueName: \"kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b\") pod \"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4\" (UID: \"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4\") " Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.382984 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b" (OuterVolumeSpecName: "kube-api-access-28n6b") pod "e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" (UID: "e37c7446-ee1e-4fba-b0ff-4b0002aa14b4"). InnerVolumeSpecName "kube-api-access-28n6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.474750 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28n6b\" (UniqueName: \"kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.693421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-nh84v" event={"ID":"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4","Type":"ContainerDied","Data":"95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f"} Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.693731 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f" Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.693506 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-nh84v" Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.753272 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-j2kpr"] Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.760961 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-j2kpr"] Mar 20 18:10:07 crc kubenswrapper[4795]: I0320 18:10:07.266419 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c32f39-7999-4aa6-be6f-bdfc11c61cf8" path="/var/lib/kubelet/pods/42c32f39-7999-4aa6-be6f-bdfc11c61cf8/volumes" Mar 20 18:10:10 crc kubenswrapper[4795]: I0320 18:10:10.349855 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-njbls" Mar 20 18:10:10 crc kubenswrapper[4795]: I0320 18:10:10.401589 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-njbls" Mar 20 18:10:10 crc kubenswrapper[4795]: I0320 18:10:10.677526 4795 scope.go:117] "RemoveContainer" containerID="b5132a3bc3cc2a9d3f7f72ee052f22ef62d6486d39d7ff63bd2d4ca8c43eb377" Mar 20 18:10:11 crc kubenswrapper[4795]: I0320 18:10:11.084212 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-njbls"] Mar 20 18:10:11 crc kubenswrapper[4795]: I0320 18:10:11.755121 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-njbls" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="registry-server" containerID="cri-o://3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238" gracePeriod=2 Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.145465 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:10:12 crc 
kubenswrapper[4795]: I0320 18:10:12.205435 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.414145 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njbls" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.540019 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6hms\" (UniqueName: \"kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms\") pod \"d60e8097-033f-426d-b3d1-c6837d4e6231\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.540138 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities\") pod \"d60e8097-033f-426d-b3d1-c6837d4e6231\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.540226 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content\") pod \"d60e8097-033f-426d-b3d1-c6837d4e6231\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.541490 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities" (OuterVolumeSpecName: "utilities") pod "d60e8097-033f-426d-b3d1-c6837d4e6231" (UID: "d60e8097-033f-426d-b3d1-c6837d4e6231"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.552057 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms" (OuterVolumeSpecName: "kube-api-access-t6hms") pod "d60e8097-033f-426d-b3d1-c6837d4e6231" (UID: "d60e8097-033f-426d-b3d1-c6837d4e6231"). InnerVolumeSpecName "kube-api-access-t6hms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.616640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d60e8097-033f-426d-b3d1-c6837d4e6231" (UID: "d60e8097-033f-426d-b3d1-c6837d4e6231"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.643183 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6hms\" (UniqueName: \"kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.643222 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.643236 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.769109 4795 generic.go:334] "Generic (PLEG): container finished" podID="d60e8097-033f-426d-b3d1-c6837d4e6231" 
containerID="3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238" exitCode=0 Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.769932 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njbls" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.770162 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerDied","Data":"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238"} Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.770224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerDied","Data":"d3d3bfd1cfcecc4d9898b8b0e44fe867b312105b71e72d952b6aa6c41b752b2a"} Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.770254 4795 scope.go:117] "RemoveContainer" containerID="3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.813438 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-njbls"] Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.822031 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-njbls"] Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.832022 4795 scope.go:117] "RemoveContainer" containerID="7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.863720 4795 scope.go:117] "RemoveContainer" containerID="9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.928082 4795 scope.go:117] "RemoveContainer" containerID="3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238" Mar 20 
18:10:12 crc kubenswrapper[4795]: E0320 18:10:12.928420 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238\": container with ID starting with 3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238 not found: ID does not exist" containerID="3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.928605 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238"} err="failed to get container status \"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238\": rpc error: code = NotFound desc = could not find container \"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238\": container with ID starting with 3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238 not found: ID does not exist" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.928629 4795 scope.go:117] "RemoveContainer" containerID="7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594" Mar 20 18:10:12 crc kubenswrapper[4795]: E0320 18:10:12.928975 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594\": container with ID starting with 7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594 not found: ID does not exist" containerID="7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.928999 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594"} err="failed to get container status 
\"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594\": rpc error: code = NotFound desc = could not find container \"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594\": container with ID starting with 7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594 not found: ID does not exist" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.929017 4795 scope.go:117] "RemoveContainer" containerID="9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c" Mar 20 18:10:12 crc kubenswrapper[4795]: E0320 18:10:12.929273 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c\": container with ID starting with 9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c not found: ID does not exist" containerID="9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c" Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.929295 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c"} err="failed to get container status \"9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c\": rpc error: code = NotFound desc = could not find container \"9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c\": container with ID starting with 9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c not found: ID does not exist" Mar 20 18:10:13 crc kubenswrapper[4795]: I0320 18:10:13.265818 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" path="/var/lib/kubelet/pods/d60e8097-033f-426d-b3d1-c6837d4e6231/volumes" Mar 20 18:10:14 crc kubenswrapper[4795]: I0320 18:10:14.482382 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"] Mar 20 18:10:14 
crc kubenswrapper[4795]: I0320 18:10:14.482875 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ldmj6" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="registry-server" containerID="cri-o://18c4e7fb6f51342d6b359aa691f6b4161ae6d5ef881b7ac6fe20572c49c8552d" gracePeriod=2 Mar 20 18:10:14 crc kubenswrapper[4795]: I0320 18:10:14.793055 4795 generic.go:334] "Generic (PLEG): container finished" podID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerID="18c4e7fb6f51342d6b359aa691f6b4161ae6d5ef881b7ac6fe20572c49c8552d" exitCode=0 Mar 20 18:10:14 crc kubenswrapper[4795]: I0320 18:10:14.793134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerDied","Data":"18c4e7fb6f51342d6b359aa691f6b4161ae6d5ef881b7ac6fe20572c49c8552d"} Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.160432 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.291225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities\") pod \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.291294 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q55qg\" (UniqueName: \"kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg\") pod \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.291320 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content\") pod \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.294471 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities" (OuterVolumeSpecName: "utilities") pod "fdd2f5e2-4399-4d44-9266-a6221ff1548c" (UID: "fdd2f5e2-4399-4d44-9266-a6221ff1548c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.300405 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg" (OuterVolumeSpecName: "kube-api-access-q55qg") pod "fdd2f5e2-4399-4d44-9266-a6221ff1548c" (UID: "fdd2f5e2-4399-4d44-9266-a6221ff1548c"). InnerVolumeSpecName "kube-api-access-q55qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.393255 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.393287 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q55qg\" (UniqueName: \"kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.442533 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdd2f5e2-4399-4d44-9266-a6221ff1548c" (UID: "fdd2f5e2-4399-4d44-9266-a6221ff1548c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.494410 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.803634 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerDied","Data":"8ff5c9d939ea79173148d650042c34b46a99524d892c547cc67c59b58f443670"} Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.803716 4795 scope.go:117] "RemoveContainer" containerID="18c4e7fb6f51342d6b359aa691f6b4161ae6d5ef881b7ac6fe20572c49c8552d" Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.803856 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.836633 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"] Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.837997 4795 scope.go:117] "RemoveContainer" containerID="0321d4359ce799a59310c35a21c76695821cab3f794b5aecb8e10422c41bff1f" Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.846862 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"] Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.860356 4795 scope.go:117] "RemoveContainer" containerID="277d3cb4b0e2d6af98c3f5ab3c45418763eb9ab970b847eeb8b262c597a914dd" Mar 20 18:10:16 crc kubenswrapper[4795]: I0320 18:10:16.252729 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:10:16 crc kubenswrapper[4795]: E0320 18:10:16.252991 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:10:17 crc kubenswrapper[4795]: I0320 18:10:17.269380 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" path="/var/lib/kubelet/pods/fdd2f5e2-4399-4d44-9266-a6221ff1548c/volumes" Mar 20 18:10:28 crc kubenswrapper[4795]: I0320 18:10:28.252459 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:10:28 crc kubenswrapper[4795]: E0320 18:10:28.254473 4795 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:10:40 crc kubenswrapper[4795]: I0320 18:10:40.252626 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:10:40 crc kubenswrapper[4795]: E0320 18:10:40.253376 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:10:52 crc kubenswrapper[4795]: I0320 18:10:52.252495 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:10:52 crc kubenswrapper[4795]: E0320 18:10:52.253440 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:11:03 crc kubenswrapper[4795]: I0320 18:11:03.254064 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:11:03 crc kubenswrapper[4795]: E0320 18:11:03.255502 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:11:18 crc kubenswrapper[4795]: I0320 18:11:18.252652 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:11:18 crc kubenswrapper[4795]: E0320 18:11:18.253515 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:11:33 crc kubenswrapper[4795]: I0320 18:11:33.255592 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:11:33 crc kubenswrapper[4795]: E0320 18:11:33.256490 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:11:45 crc kubenswrapper[4795]: I0320 18:11:45.252224 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:11:45 crc kubenswrapper[4795]: E0320 18:11:45.253059 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:11:58 crc kubenswrapper[4795]: I0320 18:11:58.252716 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:11:58 crc kubenswrapper[4795]: E0320 18:11:58.254529 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.155463 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567172-nmn82"] Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156550 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="extract-content" Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156580 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="extract-content" Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156601 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="registry-server" Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156614 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156635 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156647 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156667 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" containerName="oc"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156681 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" containerName="oc"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156743 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="extract-utilities"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156755 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="extract-utilities"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156799 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="extract-content"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156811 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="extract-content"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156865 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="extract-utilities"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156879 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="extract-utilities"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.157186 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" containerName="oc"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.157235 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.157262 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.158363 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.164260 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.165004 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.165359 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.170182 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-nmn82"]
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.261396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w74n\" (UniqueName: \"kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n\") pod \"auto-csr-approver-29567172-nmn82\" (UID: \"c9c43e03-d618-403e-874e-ff8337f97372\") " pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.363026 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w74n\" (UniqueName: \"kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n\") pod \"auto-csr-approver-29567172-nmn82\" (UID: \"c9c43e03-d618-403e-874e-ff8337f97372\") " pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.394341 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w74n\" (UniqueName: \"kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n\") pod \"auto-csr-approver-29567172-nmn82\" (UID: \"c9c43e03-d618-403e-874e-ff8337f97372\") " pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.492050 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:01 crc kubenswrapper[4795]: I0320 18:12:00.997731 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-nmn82"]
Mar 20 18:12:01 crc kubenswrapper[4795]: I0320 18:12:01.006537 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 18:12:01 crc kubenswrapper[4795]: I0320 18:12:01.847249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-nmn82" event={"ID":"c9c43e03-d618-403e-874e-ff8337f97372","Type":"ContainerStarted","Data":"18f23700233c7b3587879479e84decfac3907e8a37c00216b2ba53a003b9b7bc"}
Mar 20 18:12:02 crc kubenswrapper[4795]: I0320 18:12:02.858981 4795 generic.go:334] "Generic (PLEG): container finished" podID="c9c43e03-d618-403e-874e-ff8337f97372" containerID="09c9bfb67ac08b2ff7e4b90082be2f738cf4aa4c4b3eb977b33ed1ee55c790cb" exitCode=0
Mar 20 18:12:02 crc kubenswrapper[4795]: I0320 18:12:02.859059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-nmn82" event={"ID":"c9c43e03-d618-403e-874e-ff8337f97372","Type":"ContainerDied","Data":"09c9bfb67ac08b2ff7e4b90082be2f738cf4aa4c4b3eb977b33ed1ee55c790cb"}
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.351045 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.440063 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w74n\" (UniqueName: \"kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n\") pod \"c9c43e03-d618-403e-874e-ff8337f97372\" (UID: \"c9c43e03-d618-403e-874e-ff8337f97372\") "
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.444961 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n" (OuterVolumeSpecName: "kube-api-access-7w74n") pod "c9c43e03-d618-403e-874e-ff8337f97372" (UID: "c9c43e03-d618-403e-874e-ff8337f97372"). InnerVolumeSpecName "kube-api-access-7w74n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.541960 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w74n\" (UniqueName: \"kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n\") on node \"crc\" DevicePath \"\""
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.884084 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-nmn82" event={"ID":"c9c43e03-d618-403e-874e-ff8337f97372","Type":"ContainerDied","Data":"18f23700233c7b3587879479e84decfac3907e8a37c00216b2ba53a003b9b7bc"}
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.884146 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18f23700233c7b3587879479e84decfac3907e8a37c00216b2ba53a003b9b7bc"
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.884160 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:05 crc kubenswrapper[4795]: I0320 18:12:05.443743 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-n9tld"]
Mar 20 18:12:05 crc kubenswrapper[4795]: I0320 18:12:05.455412 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-n9tld"]
Mar 20 18:12:07 crc kubenswrapper[4795]: I0320 18:12:07.263511 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26125bad-5b31-4c3d-901b-758cb842af78" path="/var/lib/kubelet/pods/26125bad-5b31-4c3d-901b-758cb842af78/volumes"
Mar 20 18:12:10 crc kubenswrapper[4795]: I0320 18:12:10.831350 4795 scope.go:117] "RemoveContainer" containerID="388b030bf3690b9a4b1c0fb962fdb64c5a18c01bd7022e209c55231177d61b95"
Mar 20 18:12:12 crc kubenswrapper[4795]: I0320 18:12:12.253208 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:12:12 crc kubenswrapper[4795]: E0320 18:12:12.253970 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:12:27 crc kubenswrapper[4795]: I0320 18:12:27.265044 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:12:27 crc kubenswrapper[4795]: E0320 18:12:27.266249 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:12:41 crc kubenswrapper[4795]: I0320 18:12:41.253556 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:12:41 crc kubenswrapper[4795]: E0320 18:12:41.254576 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:12:54 crc kubenswrapper[4795]: I0320 18:12:54.252013 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:12:54 crc kubenswrapper[4795]: E0320 18:12:54.252919 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:12:55 crc kubenswrapper[4795]: I0320 18:12:55.679337 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 18:12:55 crc kubenswrapper[4795]: I0320 18:12:55.679742 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 18:13:08 crc kubenswrapper[4795]: I0320 18:13:08.251701 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:13:08 crc kubenswrapper[4795]: E0320 18:13:08.252477 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:13:23 crc kubenswrapper[4795]: I0320 18:13:23.252758 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:13:23 crc kubenswrapper[4795]: E0320 18:13:23.253503 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:13:36 crc kubenswrapper[4795]: I0320 18:13:36.252294 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:13:36 crc kubenswrapper[4795]: E0320 18:13:36.252975 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:13:51 crc kubenswrapper[4795]: I0320 18:13:51.259584 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:13:52 crc kubenswrapper[4795]: I0320 18:13:52.318709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d"}
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.189187 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567174-8lkms"]
Mar 20 18:14:00 crc kubenswrapper[4795]: E0320 18:14:00.190083 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c43e03-d618-403e-874e-ff8337f97372" containerName="oc"
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.190095 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c43e03-d618-403e-874e-ff8337f97372" containerName="oc"
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.190280 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c43e03-d618-403e-874e-ff8337f97372" containerName="oc"
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.190929 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-8lkms"
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.195311 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.195503 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.195652 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.214643 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-8lkms"]
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.342091 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz6fh\" (UniqueName: \"kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh\") pod \"auto-csr-approver-29567174-8lkms\" (UID: \"d290fd42-4040-428e-8af1-8091250112e7\") " pod="openshift-infra/auto-csr-approver-29567174-8lkms"
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.444393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz6fh\" (UniqueName: \"kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh\") pod \"auto-csr-approver-29567174-8lkms\" (UID: \"d290fd42-4040-428e-8af1-8091250112e7\") " pod="openshift-infra/auto-csr-approver-29567174-8lkms"
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.464680 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz6fh\" (UniqueName: \"kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh\") pod \"auto-csr-approver-29567174-8lkms\" (UID: \"d290fd42-4040-428e-8af1-8091250112e7\") " pod="openshift-infra/auto-csr-approver-29567174-8lkms"
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.517939 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-8lkms"
Mar 20 18:14:01 crc kubenswrapper[4795]: I0320 18:14:01.000715 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-8lkms"]
Mar 20 18:14:01 crc kubenswrapper[4795]: W0320 18:14:01.008784 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd290fd42_4040_428e_8af1_8091250112e7.slice/crio-567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f WatchSource:0}: Error finding container 567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f: Status 404 returned error can't find the container with id 567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f
Mar 20 18:14:01 crc kubenswrapper[4795]: I0320 18:14:01.423439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-8lkms" event={"ID":"d290fd42-4040-428e-8af1-8091250112e7","Type":"ContainerStarted","Data":"567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f"}
Mar 20 18:14:02 crc kubenswrapper[4795]: I0320 18:14:02.460294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-8lkms" event={"ID":"d290fd42-4040-428e-8af1-8091250112e7","Type":"ContainerStarted","Data":"6d1a2371250aa4bfa6255ea0f649377d871a255b30df541957c8c5e80c58e7c1"}
Mar 20 18:14:02 crc kubenswrapper[4795]: I0320 18:14:02.484615 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567174-8lkms" podStartSLOduration=1.5877354769999998 podStartE2EDuration="2.484595445s" podCreationTimestamp="2026-03-20 18:14:00 +0000 UTC" firstStartedPulling="2026-03-20 18:14:01.011333266 +0000 UTC m=+3384.469364817" lastFinishedPulling="2026-03-20 18:14:01.908193234 +0000 UTC m=+3385.366224785" observedRunningTime="2026-03-20 18:14:02.480270201 +0000 UTC m=+3385.938301772" watchObservedRunningTime="2026-03-20 18:14:02.484595445 +0000 UTC m=+3385.942626996"
Mar 20 18:14:03 crc kubenswrapper[4795]: I0320 18:14:03.472910 4795 generic.go:334] "Generic (PLEG): container finished" podID="d290fd42-4040-428e-8af1-8091250112e7" containerID="6d1a2371250aa4bfa6255ea0f649377d871a255b30df541957c8c5e80c58e7c1" exitCode=0
Mar 20 18:14:03 crc kubenswrapper[4795]: I0320 18:14:03.473043 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-8lkms" event={"ID":"d290fd42-4040-428e-8af1-8091250112e7","Type":"ContainerDied","Data":"6d1a2371250aa4bfa6255ea0f649377d871a255b30df541957c8c5e80c58e7c1"}
Mar 20 18:14:04 crc kubenswrapper[4795]: I0320 18:14:04.961703 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-8lkms"
Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.033534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz6fh\" (UniqueName: \"kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh\") pod \"d290fd42-4040-428e-8af1-8091250112e7\" (UID: \"d290fd42-4040-428e-8af1-8091250112e7\") "
Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.056900 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh" (OuterVolumeSpecName: "kube-api-access-cz6fh") pod "d290fd42-4040-428e-8af1-8091250112e7" (UID: "d290fd42-4040-428e-8af1-8091250112e7"). InnerVolumeSpecName "kube-api-access-cz6fh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.136180 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz6fh\" (UniqueName: \"kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh\") on node \"crc\" DevicePath \"\""
Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.497887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-8lkms" event={"ID":"d290fd42-4040-428e-8af1-8091250112e7","Type":"ContainerDied","Data":"567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f"}
Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.498080 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f"
Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.497968 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-8lkms"
Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.572327 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-m7dd2"]
Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.580619 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-m7dd2"]
Mar 20 18:14:07 crc kubenswrapper[4795]: I0320 18:14:07.264411 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47914a4d-df4b-443d-b7f4-b30bfe9e7a98" path="/var/lib/kubelet/pods/47914a4d-df4b-443d-b7f4-b30bfe9e7a98/volumes"
Mar 20 18:14:10 crc kubenswrapper[4795]: I0320 18:14:10.973450 4795 scope.go:117] "RemoveContainer" containerID="bf867a6e7d6c5ea2dcb1d75b1399a5354375f53c845fe9a30c34fd725a0eba5f"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.168827 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"]
Mar 20 18:15:00 crc kubenswrapper[4795]: E0320 18:15:00.177334 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d290fd42-4040-428e-8af1-8091250112e7" containerName="oc"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.177364 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d290fd42-4040-428e-8af1-8091250112e7" containerName="oc"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.177560 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d290fd42-4040-428e-8af1-8091250112e7" containerName="oc"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.178138 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"]
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.178219 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.183096 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.183227 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.304172 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27jq9\" (UniqueName: \"kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.304561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.304710 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.406140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.406217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27jq9\" (UniqueName: \"kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.406290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.407254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.413801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.439792 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27jq9\" (UniqueName: \"kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.514281 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.966358 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"]
Mar 20 18:15:01 crc kubenswrapper[4795]: I0320 18:15:01.065110 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" event={"ID":"6419bd80-fcd7-4b70-bb03-c8a97aa4da93","Type":"ContainerStarted","Data":"8148282fa276f46ca897b62036478449efe04e103762f682938e37d456ce8f61"}
Mar 20 18:15:02 crc kubenswrapper[4795]: I0320 18:15:02.079096 4795 generic.go:334] "Generic (PLEG): container finished" podID="6419bd80-fcd7-4b70-bb03-c8a97aa4da93" containerID="f4bbde7b605a50344cd144b9443257804c13dc6048d563bf2b8b2bd90820631f" exitCode=0
Mar 20 18:15:02 crc kubenswrapper[4795]: I0320 18:15:02.079407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" event={"ID":"6419bd80-fcd7-4b70-bb03-c8a97aa4da93","Type":"ContainerDied","Data":"f4bbde7b605a50344cd144b9443257804c13dc6048d563bf2b8b2bd90820631f"}
Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.676196 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.878591 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume\") pod \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") "
Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.878822 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27jq9\" (UniqueName: \"kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9\") pod \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") "
Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.878858 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume\") pod \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") "
Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.879437 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume" (OuterVolumeSpecName: "config-volume") pod "6419bd80-fcd7-4b70-bb03-c8a97aa4da93" (UID: "6419bd80-fcd7-4b70-bb03-c8a97aa4da93"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.884799 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9" (OuterVolumeSpecName: "kube-api-access-27jq9") pod "6419bd80-fcd7-4b70-bb03-c8a97aa4da93" (UID: "6419bd80-fcd7-4b70-bb03-c8a97aa4da93"). InnerVolumeSpecName "kube-api-access-27jq9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.884969 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6419bd80-fcd7-4b70-bb03-c8a97aa4da93" (UID: "6419bd80-fcd7-4b70-bb03-c8a97aa4da93"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.981593 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.981628 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27jq9\" (UniqueName: \"kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9\") on node \"crc\" DevicePath \"\""
Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.981638 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 18:15:04 crc kubenswrapper[4795]: I0320 18:15:04.106393 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" event={"ID":"6419bd80-fcd7-4b70-bb03-c8a97aa4da93","Type":"ContainerDied","Data":"8148282fa276f46ca897b62036478449efe04e103762f682938e37d456ce8f61"}
Mar 20 18:15:04 crc kubenswrapper[4795]: I0320 18:15:04.106454 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8148282fa276f46ca897b62036478449efe04e103762f682938e37d456ce8f61"
Mar 20 18:15:04 crc kubenswrapper[4795]: I0320 18:15:04.106476 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"
Mar 20 18:15:04 crc kubenswrapper[4795]: I0320 18:15:04.771897 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn"]
Mar 20 18:15:04 crc kubenswrapper[4795]: I0320 18:15:04.780898 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn"]
Mar 20 18:15:05 crc kubenswrapper[4795]: I0320 18:15:05.279026 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6892589-ca9a-45cc-8991-ab0029e67e3c" path="/var/lib/kubelet/pods/a6892589-ca9a-45cc-8991-ab0029e67e3c/volumes"
Mar 20 18:15:11 crc kubenswrapper[4795]: I0320 18:15:11.070190 4795 scope.go:117] "RemoveContainer" containerID="c957ead85ece246848e605f8f78734d00ae750bd985db9f200ae787909bd1425"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.158922 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567176-jlvh2"]
Mar 20 18:16:00 crc kubenswrapper[4795]: E0320 18:16:00.159942 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6419bd80-fcd7-4b70-bb03-c8a97aa4da93" containerName="collect-profiles"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.159957 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6419bd80-fcd7-4b70-bb03-c8a97aa4da93" containerName="collect-profiles"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.160172 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6419bd80-fcd7-4b70-bb03-c8a97aa4da93" containerName="collect-profiles"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.160826 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-jlvh2"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.163912 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.164102 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.164111 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.176585 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-jlvh2"]
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.261001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2tg\" (UniqueName: \"kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg\") pod \"auto-csr-approver-29567176-jlvh2\" (UID: \"6847a127-2563-4611-aa3c-5de097af7485\") " pod="openshift-infra/auto-csr-approver-29567176-jlvh2"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.363012 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2tg\" (UniqueName: \"kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg\") pod \"auto-csr-approver-29567176-jlvh2\" (UID: \"6847a127-2563-4611-aa3c-5de097af7485\") " pod="openshift-infra/auto-csr-approver-29567176-jlvh2"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.393357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2tg\" (UniqueName: \"kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg\") pod \"auto-csr-approver-29567176-jlvh2\" (UID: \"6847a127-2563-4611-aa3c-5de097af7485\") " pod="openshift-infra/auto-csr-approver-29567176-jlvh2"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.485959 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-jlvh2"
Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.978614 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-jlvh2"]
Mar 20 18:16:01 crc kubenswrapper[4795]: W0320 18:16:01.004316 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6847a127_2563_4611_aa3c_5de097af7485.slice/crio-8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725 WatchSource:0}: Error finding container 8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725: Status 404 returned error can't find the container with id 8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725
Mar 20 18:16:01 crc kubenswrapper[4795]: I0320 18:16:01.730340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" event={"ID":"6847a127-2563-4611-aa3c-5de097af7485","Type":"ContainerStarted","Data":"8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725"}
Mar 20 18:16:02 crc kubenswrapper[4795]: I0320 18:16:02.747012 4795 generic.go:334] "Generic (PLEG): container finished" podID="6847a127-2563-4611-aa3c-5de097af7485" containerID="fcb0c54c2a527f381862afe1aaeeba3ced38b835a91522600710892ac634473c" exitCode=0
Mar 20 18:16:02 crc kubenswrapper[4795]: I0320 18:16:02.747121 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" event={"ID":"6847a127-2563-4611-aa3c-5de097af7485","Type":"ContainerDied","Data":"fcb0c54c2a527f381862afe1aaeeba3ced38b835a91522600710892ac634473c"}
Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.313575 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.353749 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz2tg\" (UniqueName: \"kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg\") pod \"6847a127-2563-4611-aa3c-5de097af7485\" (UID: \"6847a127-2563-4611-aa3c-5de097af7485\") " Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.362050 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg" (OuterVolumeSpecName: "kube-api-access-xz2tg") pod "6847a127-2563-4611-aa3c-5de097af7485" (UID: "6847a127-2563-4611-aa3c-5de097af7485"). InnerVolumeSpecName "kube-api-access-xz2tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.456087 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz2tg\" (UniqueName: \"kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg\") on node \"crc\" DevicePath \"\"" Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.771861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" event={"ID":"6847a127-2563-4611-aa3c-5de097af7485","Type":"ContainerDied","Data":"8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725"} Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.771902 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.771907 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725" Mar 20 18:16:05 crc kubenswrapper[4795]: I0320 18:16:05.399480 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-nh84v"] Mar 20 18:16:05 crc kubenswrapper[4795]: I0320 18:16:05.410104 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-nh84v"] Mar 20 18:16:07 crc kubenswrapper[4795]: I0320 18:16:07.270834 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" path="/var/lib/kubelet/pods/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4/volumes" Mar 20 18:16:11 crc kubenswrapper[4795]: I0320 18:16:11.131632 4795 scope.go:117] "RemoveContainer" containerID="678d64f2bd651427dd881bafa2432c3aa8854cce5009833f7ed8c8ba9e0a700c" Mar 20 18:16:11 crc kubenswrapper[4795]: I0320 18:16:11.300397 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:16:11 crc kubenswrapper[4795]: I0320 18:16:11.300473 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:16:41 crc kubenswrapper[4795]: I0320 18:16:41.299991 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:16:41 crc kubenswrapper[4795]: I0320 18:16:41.300771 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.777470 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:10 crc kubenswrapper[4795]: E0320 18:17:10.778617 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6847a127-2563-4611-aa3c-5de097af7485" containerName="oc" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.778635 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6847a127-2563-4611-aa3c-5de097af7485" containerName="oc" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.779020 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6847a127-2563-4611-aa3c-5de097af7485" containerName="oc" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.780766 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.803147 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.874810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.874880 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.874942 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lt6v\" (UniqueName: \"kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.977478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.977606 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.977770 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lt6v\" (UniqueName: \"kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.978094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.978170 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.003619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lt6v\" (UniqueName: \"kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.115798 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.301980 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.302283 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.302332 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.303123 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.303181 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d" gracePeriod=600 Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.519038 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d" exitCode=0 Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.519078 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d"} Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.519110 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:17:11 crc kubenswrapper[4795]: W0320 18:17:11.604092 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod693b9900_2584_45d2_9cf6_9bc22c3c010c.slice/crio-a68d49abbad2da526d582af2866a8efeaa8eaeef6ab510b1836b5ed1e1ce0105 WatchSource:0}: Error finding container a68d49abbad2da526d582af2866a8efeaa8eaeef6ab510b1836b5ed1e1ce0105: Status 404 returned error can't find the container with id a68d49abbad2da526d582af2866a8efeaa8eaeef6ab510b1836b5ed1e1ce0105 Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.604542 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:12 crc kubenswrapper[4795]: I0320 18:17:12.534839 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"} Mar 20 18:17:12 crc kubenswrapper[4795]: I0320 18:17:12.539115 4795 generic.go:334] "Generic (PLEG): container finished" podID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerID="f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e" exitCode=0 Mar 20 18:17:12 crc kubenswrapper[4795]: 
I0320 18:17:12.539189 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerDied","Data":"f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e"} Mar 20 18:17:12 crc kubenswrapper[4795]: I0320 18:17:12.539236 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerStarted","Data":"a68d49abbad2da526d582af2866a8efeaa8eaeef6ab510b1836b5ed1e1ce0105"} Mar 20 18:17:12 crc kubenswrapper[4795]: I0320 18:17:12.542986 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:17:13 crc kubenswrapper[4795]: I0320 18:17:13.553678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerStarted","Data":"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377"} Mar 20 18:17:15 crc kubenswrapper[4795]: I0320 18:17:15.568849 4795 generic.go:334] "Generic (PLEG): container finished" podID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerID="0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377" exitCode=0 Mar 20 18:17:15 crc kubenswrapper[4795]: I0320 18:17:15.568927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerDied","Data":"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377"} Mar 20 18:17:16 crc kubenswrapper[4795]: I0320 18:17:16.583566 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerStarted","Data":"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049"} Mar 20 
18:17:16 crc kubenswrapper[4795]: I0320 18:17:16.605775 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fvx6c" podStartSLOduration=3.15985909 podStartE2EDuration="6.605759674s" podCreationTimestamp="2026-03-20 18:17:10 +0000 UTC" firstStartedPulling="2026-03-20 18:17:12.542492025 +0000 UTC m=+3576.000523606" lastFinishedPulling="2026-03-20 18:17:15.988392649 +0000 UTC m=+3579.446424190" observedRunningTime="2026-03-20 18:17:16.602344087 +0000 UTC m=+3580.060375628" watchObservedRunningTime="2026-03-20 18:17:16.605759674 +0000 UTC m=+3580.063791215" Mar 20 18:17:21 crc kubenswrapper[4795]: I0320 18:17:21.116681 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:21 crc kubenswrapper[4795]: I0320 18:17:21.117253 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:21 crc kubenswrapper[4795]: I0320 18:17:21.178965 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:21 crc kubenswrapper[4795]: I0320 18:17:21.679788 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:21 crc kubenswrapper[4795]: I0320 18:17:21.724797 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:23 crc kubenswrapper[4795]: I0320 18:17:23.667462 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fvx6c" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="registry-server" containerID="cri-o://c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049" gracePeriod=2 Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.248648 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.355148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content\") pod \"693b9900-2584-45d2-9cf6-9bc22c3c010c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.355985 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lt6v\" (UniqueName: \"kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v\") pod \"693b9900-2584-45d2-9cf6-9bc22c3c010c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.356088 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities\") pod \"693b9900-2584-45d2-9cf6-9bc22c3c010c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.358169 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities" (OuterVolumeSpecName: "utilities") pod "693b9900-2584-45d2-9cf6-9bc22c3c010c" (UID: "693b9900-2584-45d2-9cf6-9bc22c3c010c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.367753 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v" (OuterVolumeSpecName: "kube-api-access-7lt6v") pod "693b9900-2584-45d2-9cf6-9bc22c3c010c" (UID: "693b9900-2584-45d2-9cf6-9bc22c3c010c"). 
InnerVolumeSpecName "kube-api-access-7lt6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.422051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "693b9900-2584-45d2-9cf6-9bc22c3c010c" (UID: "693b9900-2584-45d2-9cf6-9bc22c3c010c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.458052 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.458087 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.458097 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lt6v\" (UniqueName: \"kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v\") on node \"crc\" DevicePath \"\"" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.680800 4795 generic.go:334] "Generic (PLEG): container finished" podID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerID="c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049" exitCode=0 Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.680839 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerDied","Data":"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049"} Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.681156 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerDied","Data":"a68d49abbad2da526d582af2866a8efeaa8eaeef6ab510b1836b5ed1e1ce0105"} Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.680923 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.681177 4795 scope.go:117] "RemoveContainer" containerID="c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.718639 4795 scope.go:117] "RemoveContainer" containerID="0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.725298 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.734928 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.752184 4795 scope.go:117] "RemoveContainer" containerID="f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.807406 4795 scope.go:117] "RemoveContainer" containerID="c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049" Mar 20 18:17:24 crc kubenswrapper[4795]: E0320 18:17:24.807975 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049\": container with ID starting with c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049 not found: ID does not exist" containerID="c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049" Mar 20 18:17:24 
crc kubenswrapper[4795]: I0320 18:17:24.808011 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049"} err="failed to get container status \"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049\": rpc error: code = NotFound desc = could not find container \"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049\": container with ID starting with c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049 not found: ID does not exist" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.808038 4795 scope.go:117] "RemoveContainer" containerID="0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377" Mar 20 18:17:24 crc kubenswrapper[4795]: E0320 18:17:24.808521 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377\": container with ID starting with 0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377 not found: ID does not exist" containerID="0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.808587 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377"} err="failed to get container status \"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377\": rpc error: code = NotFound desc = could not find container \"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377\": container with ID starting with 0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377 not found: ID does not exist" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.808628 4795 scope.go:117] "RemoveContainer" containerID="f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e" Mar 20 
18:17:24 crc kubenswrapper[4795]: E0320 18:17:24.809089 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e\": container with ID starting with f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e not found: ID does not exist" containerID="f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.809121 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e"} err="failed to get container status \"f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e\": rpc error: code = NotFound desc = could not find container \"f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e\": container with ID starting with f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e not found: ID does not exist" Mar 20 18:17:25 crc kubenswrapper[4795]: I0320 18:17:25.273984 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" path="/var/lib/kubelet/pods/693b9900-2584-45d2-9cf6-9bc22c3c010c/volumes" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.180378 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567178-vbj6l"] Mar 20 18:18:00 crc kubenswrapper[4795]: E0320 18:18:00.181221 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="extract-content" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.181234 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="extract-content" Mar 20 18:18:00 crc kubenswrapper[4795]: E0320 18:18:00.181253 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="extract-utilities" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.181260 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="extract-utilities" Mar 20 18:18:00 crc kubenswrapper[4795]: E0320 18:18:00.181276 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="registry-server" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.181282 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="registry-server" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.181458 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="registry-server" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.182097 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.186895 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.187377 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.187972 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.210123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-vbj6l"] Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.290735 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhcj6\" (UniqueName: 
\"kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6\") pod \"auto-csr-approver-29567178-vbj6l\" (UID: \"cf1bb697-899c-48fe-984a-61258e78cd87\") " pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.393316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhcj6\" (UniqueName: \"kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6\") pod \"auto-csr-approver-29567178-vbj6l\" (UID: \"cf1bb697-899c-48fe-984a-61258e78cd87\") " pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.434598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhcj6\" (UniqueName: \"kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6\") pod \"auto-csr-approver-29567178-vbj6l\" (UID: \"cf1bb697-899c-48fe-984a-61258e78cd87\") " pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.505199 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:01 crc kubenswrapper[4795]: I0320 18:18:00.999271 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-vbj6l"] Mar 20 18:18:01 crc kubenswrapper[4795]: I0320 18:18:01.113523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" event={"ID":"cf1bb697-899c-48fe-984a-61258e78cd87","Type":"ContainerStarted","Data":"9dbe8e0a3fbec037daa28087ca926302146d2f153b7efd96de4164b0d50b1132"} Mar 20 18:18:03 crc kubenswrapper[4795]: I0320 18:18:03.142218 4795 generic.go:334] "Generic (PLEG): container finished" podID="cf1bb697-899c-48fe-984a-61258e78cd87" containerID="78ef24a78e7dcac7e46de63dc467ed76c00bbe2831c0a8a33ad6b914782524d5" exitCode=0 Mar 20 18:18:03 crc kubenswrapper[4795]: I0320 18:18:03.142316 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" event={"ID":"cf1bb697-899c-48fe-984a-61258e78cd87","Type":"ContainerDied","Data":"78ef24a78e7dcac7e46de63dc467ed76c00bbe2831c0a8a33ad6b914782524d5"} Mar 20 18:18:04 crc kubenswrapper[4795]: I0320 18:18:04.613291 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:04 crc kubenswrapper[4795]: I0320 18:18:04.778527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhcj6\" (UniqueName: \"kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6\") pod \"cf1bb697-899c-48fe-984a-61258e78cd87\" (UID: \"cf1bb697-899c-48fe-984a-61258e78cd87\") " Mar 20 18:18:04 crc kubenswrapper[4795]: I0320 18:18:04.785931 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6" (OuterVolumeSpecName: "kube-api-access-fhcj6") pod "cf1bb697-899c-48fe-984a-61258e78cd87" (UID: "cf1bb697-899c-48fe-984a-61258e78cd87"). InnerVolumeSpecName "kube-api-access-fhcj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:18:04 crc kubenswrapper[4795]: I0320 18:18:04.882210 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhcj6\" (UniqueName: \"kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:05 crc kubenswrapper[4795]: I0320 18:18:05.167633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" event={"ID":"cf1bb697-899c-48fe-984a-61258e78cd87","Type":"ContainerDied","Data":"9dbe8e0a3fbec037daa28087ca926302146d2f153b7efd96de4164b0d50b1132"} Mar 20 18:18:05 crc kubenswrapper[4795]: I0320 18:18:05.167762 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dbe8e0a3fbec037daa28087ca926302146d2f153b7efd96de4164b0d50b1132" Mar 20 18:18:05 crc kubenswrapper[4795]: I0320 18:18:05.168120 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:05 crc kubenswrapper[4795]: I0320 18:18:05.720914 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-nmn82"] Mar 20 18:18:05 crc kubenswrapper[4795]: I0320 18:18:05.731606 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-nmn82"] Mar 20 18:18:07 crc kubenswrapper[4795]: I0320 18:18:07.265329 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c43e03-d618-403e-874e-ff8337f97372" path="/var/lib/kubelet/pods/c9c43e03-d618-403e-874e-ff8337f97372/volumes" Mar 20 18:18:11 crc kubenswrapper[4795]: I0320 18:18:11.222559 4795 scope.go:117] "RemoveContainer" containerID="09c9bfb67ac08b2ff7e4b90082be2f738cf4aa4c4b3eb977b33ed1ee55c790cb" Mar 20 18:18:34 crc kubenswrapper[4795]: I0320 18:18:34.959714 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"] Mar 20 18:18:34 crc kubenswrapper[4795]: E0320 18:18:34.960682 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1bb697-899c-48fe-984a-61258e78cd87" containerName="oc" Mar 20 18:18:34 crc kubenswrapper[4795]: I0320 18:18:34.960713 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1bb697-899c-48fe-984a-61258e78cd87" containerName="oc" Mar 20 18:18:34 crc kubenswrapper[4795]: I0320 18:18:34.960929 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1bb697-899c-48fe-984a-61258e78cd87" containerName="oc" Mar 20 18:18:34 crc kubenswrapper[4795]: I0320 18:18:34.962677 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:34 crc kubenswrapper[4795]: I0320 18:18:34.984059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"] Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.098561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.098644 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4r82\" (UniqueName: \"kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.098886 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.201080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.201531 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.201578 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4r82\" (UniqueName: \"kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.201986 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.202207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.231853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4r82\" (UniqueName: \"kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.289824 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.816759 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"] Mar 20 18:18:35 crc kubenswrapper[4795]: W0320 18:18:35.818152 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod282a97b6_2e41_4e26_a941_77a9c94206cb.slice/crio-8cc1ef88ff1d93f454f1b32320bba18f8ff1fb8b9090c61383c2422173e8185a WatchSource:0}: Error finding container 8cc1ef88ff1d93f454f1b32320bba18f8ff1fb8b9090c61383c2422173e8185a: Status 404 returned error can't find the container with id 8cc1ef88ff1d93f454f1b32320bba18f8ff1fb8b9090c61383c2422173e8185a Mar 20 18:18:36 crc kubenswrapper[4795]: I0320 18:18:36.522122 4795 generic.go:334] "Generic (PLEG): container finished" podID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerID="b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c" exitCode=0 Mar 20 18:18:36 crc kubenswrapper[4795]: I0320 18:18:36.522241 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerDied","Data":"b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c"} Mar 20 18:18:36 crc kubenswrapper[4795]: I0320 18:18:36.522506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerStarted","Data":"8cc1ef88ff1d93f454f1b32320bba18f8ff1fb8b9090c61383c2422173e8185a"} Mar 20 18:18:37 crc kubenswrapper[4795]: I0320 18:18:37.539436 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" 
event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerStarted","Data":"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5"} Mar 20 18:18:38 crc kubenswrapper[4795]: I0320 18:18:38.577969 4795 generic.go:334] "Generic (PLEG): container finished" podID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerID="eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5" exitCode=0 Mar 20 18:18:38 crc kubenswrapper[4795]: I0320 18:18:38.578027 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerDied","Data":"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5"} Mar 20 18:18:39 crc kubenswrapper[4795]: I0320 18:18:39.591002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerStarted","Data":"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4"} Mar 20 18:18:39 crc kubenswrapper[4795]: I0320 18:18:39.619785 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-99xjf" podStartSLOduration=2.990488324 podStartE2EDuration="5.619761594s" podCreationTimestamp="2026-03-20 18:18:34 +0000 UTC" firstStartedPulling="2026-03-20 18:18:36.524653009 +0000 UTC m=+3659.982684550" lastFinishedPulling="2026-03-20 18:18:39.153926289 +0000 UTC m=+3662.611957820" observedRunningTime="2026-03-20 18:18:39.609086962 +0000 UTC m=+3663.067118543" watchObservedRunningTime="2026-03-20 18:18:39.619761594 +0000 UTC m=+3663.077793145" Mar 20 18:18:45 crc kubenswrapper[4795]: I0320 18:18:45.290849 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:45 crc kubenswrapper[4795]: I0320 18:18:45.291549 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:45 crc kubenswrapper[4795]: I0320 18:18:45.348325 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:45 crc kubenswrapper[4795]: I0320 18:18:45.692959 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:45 crc kubenswrapper[4795]: I0320 18:18:45.745547 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"] Mar 20 18:18:47 crc kubenswrapper[4795]: I0320 18:18:47.670783 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-99xjf" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="registry-server" containerID="cri-o://b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4" gracePeriod=2 Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.235023 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.373367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content\") pod \"282a97b6-2e41-4e26-a941-77a9c94206cb\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.373422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4r82\" (UniqueName: \"kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82\") pod \"282a97b6-2e41-4e26-a941-77a9c94206cb\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.373626 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities\") pod \"282a97b6-2e41-4e26-a941-77a9c94206cb\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.374715 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities" (OuterVolumeSpecName: "utilities") pod "282a97b6-2e41-4e26-a941-77a9c94206cb" (UID: "282a97b6-2e41-4e26-a941-77a9c94206cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.384902 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82" (OuterVolumeSpecName: "kube-api-access-k4r82") pod "282a97b6-2e41-4e26-a941-77a9c94206cb" (UID: "282a97b6-2e41-4e26-a941-77a9c94206cb"). InnerVolumeSpecName "kube-api-access-k4r82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.404470 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "282a97b6-2e41-4e26-a941-77a9c94206cb" (UID: "282a97b6-2e41-4e26-a941-77a9c94206cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.475739 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.475774 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.475785 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4r82\" (UniqueName: \"kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.681752 4795 generic.go:334] "Generic (PLEG): container finished" podID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerID="b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4" exitCode=0 Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.681807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerDied","Data":"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4"} Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.681844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerDied","Data":"8cc1ef88ff1d93f454f1b32320bba18f8ff1fb8b9090c61383c2422173e8185a"} Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.681868 4795 scope.go:117] "RemoveContainer" containerID="b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.682005 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99xjf" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.723343 4795 scope.go:117] "RemoveContainer" containerID="eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.726358 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"] Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.733159 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"] Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.744524 4795 scope.go:117] "RemoveContainer" containerID="b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.819417 4795 scope.go:117] "RemoveContainer" containerID="b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4" Mar 20 18:18:48 crc kubenswrapper[4795]: E0320 18:18:48.820051 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4\": container with ID starting with b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4 not found: ID does not exist" containerID="b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.820104 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4"} err="failed to get container status \"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4\": rpc error: code = NotFound desc = could not find container \"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4\": container with ID starting with b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4 not found: ID does not exist" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.820146 4795 scope.go:117] "RemoveContainer" containerID="eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5" Mar 20 18:18:48 crc kubenswrapper[4795]: E0320 18:18:48.820500 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5\": container with ID starting with eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5 not found: ID does not exist" containerID="eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.820539 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5"} err="failed to get container status \"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5\": rpc error: code = NotFound desc = could not find container \"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5\": container with ID starting with eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5 not found: ID does not exist" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.820564 4795 scope.go:117] "RemoveContainer" containerID="b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c" Mar 20 18:18:48 crc kubenswrapper[4795]: E0320 
18:18:48.820885 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c\": container with ID starting with b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c not found: ID does not exist" containerID="b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c" Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.820929 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c"} err="failed to get container status \"b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c\": rpc error: code = NotFound desc = could not find container \"b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c\": container with ID starting with b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c not found: ID does not exist" Mar 20 18:18:49 crc kubenswrapper[4795]: I0320 18:18:49.278345 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" path="/var/lib/kubelet/pods/282a97b6-2e41-4e26-a941-77a9c94206cb/volumes" Mar 20 18:19:11 crc kubenswrapper[4795]: I0320 18:19:11.300524 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:19:11 crc kubenswrapper[4795]: I0320 18:19:11.301256 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 18:19:41 crc kubenswrapper[4795]: I0320 18:19:41.301087 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:19:41 crc kubenswrapper[4795]: I0320 18:19:41.301938 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.145904 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567180-hdwfg"] Mar 20 18:20:00 crc kubenswrapper[4795]: E0320 18:20:00.146794 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="extract-content" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.146809 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="extract-content" Mar 20 18:20:00 crc kubenswrapper[4795]: E0320 18:20:00.146831 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="registry-server" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.146837 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="registry-server" Mar 20 18:20:00 crc kubenswrapper[4795]: E0320 18:20:00.146854 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="extract-utilities" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.146861 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="extract-utilities" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.147043 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="registry-server" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.147565 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.151025 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.151297 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.153262 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.154058 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-hdwfg"] Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.248493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6sw\" (UniqueName: \"kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw\") pod \"auto-csr-approver-29567180-hdwfg\" (UID: \"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec\") " pod="openshift-infra/auto-csr-approver-29567180-hdwfg" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.350711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6sw\" (UniqueName: \"kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw\") pod \"auto-csr-approver-29567180-hdwfg\" (UID: 
\"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec\") " pod="openshift-infra/auto-csr-approver-29567180-hdwfg" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.369321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6sw\" (UniqueName: \"kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw\") pod \"auto-csr-approver-29567180-hdwfg\" (UID: \"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec\") " pod="openshift-infra/auto-csr-approver-29567180-hdwfg" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.467780 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.937104 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-hdwfg"] Mar 20 18:20:01 crc kubenswrapper[4795]: I0320 18:20:01.454491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" event={"ID":"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec","Type":"ContainerStarted","Data":"adde32c54b8aabda2e836b8d23281371028b7d7d10f74d51b930d44f71cc947d"} Mar 20 18:20:02 crc kubenswrapper[4795]: I0320 18:20:02.465908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" event={"ID":"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec","Type":"ContainerStarted","Data":"fa6146c0ab5a972766bc66e4407b6570440e104ac1129b01c598c47dc5cc0faf"} Mar 20 18:20:02 crc kubenswrapper[4795]: I0320 18:20:02.499014 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" podStartSLOduration=1.568468051 podStartE2EDuration="2.498976416s" podCreationTimestamp="2026-03-20 18:20:00 +0000 UTC" firstStartedPulling="2026-03-20 18:20:00.933555343 +0000 UTC m=+3744.391586894" lastFinishedPulling="2026-03-20 18:20:01.864063718 +0000 UTC m=+3745.322095259" 
observedRunningTime="2026-03-20 18:20:02.484085823 +0000 UTC m=+3745.942117374" watchObservedRunningTime="2026-03-20 18:20:02.498976416 +0000 UTC m=+3745.957007997" Mar 20 18:20:03 crc kubenswrapper[4795]: I0320 18:20:03.477609 4795 generic.go:334] "Generic (PLEG): container finished" podID="78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" containerID="fa6146c0ab5a972766bc66e4407b6570440e104ac1129b01c598c47dc5cc0faf" exitCode=0 Mar 20 18:20:03 crc kubenswrapper[4795]: I0320 18:20:03.477715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" event={"ID":"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec","Type":"ContainerDied","Data":"fa6146c0ab5a972766bc66e4407b6570440e104ac1129b01c598c47dc5cc0faf"} Mar 20 18:20:04 crc kubenswrapper[4795]: I0320 18:20:04.967912 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.158325 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l6sw\" (UniqueName: \"kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw\") pod \"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec\" (UID: \"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec\") " Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.164236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw" (OuterVolumeSpecName: "kube-api-access-5l6sw") pod "78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" (UID: "78d7fab6-a6ea-4dee-bd81-84a6cfb81aec"). InnerVolumeSpecName "kube-api-access-5l6sw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.260002 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l6sw\" (UniqueName: \"kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw\") on node \"crc\" DevicePath \"\"" Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.534174 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" event={"ID":"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec","Type":"ContainerDied","Data":"adde32c54b8aabda2e836b8d23281371028b7d7d10f74d51b930d44f71cc947d"} Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.534763 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adde32c54b8aabda2e836b8d23281371028b7d7d10f74d51b930d44f71cc947d" Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.534943 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.577958 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-8lkms"] Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.586511 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-8lkms"] Mar 20 18:20:07 crc kubenswrapper[4795]: I0320 18:20:07.270393 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d290fd42-4040-428e-8af1-8091250112e7" path="/var/lib/kubelet/pods/d290fd42-4040-428e-8af1-8091250112e7/volumes" Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.300481 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.301099 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.301160 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.302578 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.302669 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" gracePeriod=600 Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.366917 4795 scope.go:117] "RemoveContainer" containerID="6d1a2371250aa4bfa6255ea0f649377d871a255b30df541957c8c5e80c58e7c1" Mar 20 18:20:11 crc kubenswrapper[4795]: E0320 18:20:11.461839 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.606823 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" exitCode=0 Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.606885 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"} Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.606934 4795 scope.go:117] "RemoveContainer" containerID="a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d" Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.607839 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:20:11 crc kubenswrapper[4795]: E0320 18:20:11.608386 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:20:25 crc kubenswrapper[4795]: I0320 18:20:25.252631 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:20:25 crc kubenswrapper[4795]: E0320 18:20:25.253375 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:20:40 crc kubenswrapper[4795]: I0320 18:20:40.252232 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:20:40 crc kubenswrapper[4795]: E0320 18:20:40.253434 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:20:55 crc kubenswrapper[4795]: I0320 18:20:55.252785 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:20:55 crc kubenswrapper[4795]: E0320 18:20:55.253598 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:21:08 crc kubenswrapper[4795]: I0320 18:21:08.252027 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:21:08 crc kubenswrapper[4795]: E0320 18:21:08.253059 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:21:23 crc kubenswrapper[4795]: I0320 18:21:23.252712 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:21:23 crc kubenswrapper[4795]: E0320 18:21:23.253376 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:21:38 crc kubenswrapper[4795]: I0320 18:21:38.252792 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:21:38 crc kubenswrapper[4795]: E0320 18:21:38.253979 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:21:53 crc kubenswrapper[4795]: I0320 18:21:53.253124 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:21:53 crc kubenswrapper[4795]: E0320 18:21:53.254388 4795 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.153089 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567182-5tldr"] Mar 20 18:22:00 crc kubenswrapper[4795]: E0320 18:22:00.154173 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" containerName="oc" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.154189 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" containerName="oc" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.154415 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" containerName="oc" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.155143 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.162351 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.162386 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.162650 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.174808 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-5tldr"] Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.278438 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwkq\" (UniqueName: \"kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq\") pod \"auto-csr-approver-29567182-5tldr\" (UID: \"478a3729-f417-4458-b0c5-562ed9c72252\") " pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.381114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwkq\" (UniqueName: \"kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq\") pod \"auto-csr-approver-29567182-5tldr\" (UID: \"478a3729-f417-4458-b0c5-562ed9c72252\") " pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.401170 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwkq\" (UniqueName: \"kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq\") pod \"auto-csr-approver-29567182-5tldr\" (UID: \"478a3729-f417-4458-b0c5-562ed9c72252\") " 
pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.481731 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.926749 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-5tldr"] Mar 20 18:22:01 crc kubenswrapper[4795]: I0320 18:22:01.748431 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-5tldr" event={"ID":"478a3729-f417-4458-b0c5-562ed9c72252","Type":"ContainerStarted","Data":"0c73ef664439c6cd1a470ac299c5d4845b2d54a38cb2b91cfd6d7b76f6d9c6d7"} Mar 20 18:22:02 crc kubenswrapper[4795]: E0320 18:22:02.613577 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod478a3729_f417_4458_b0c5_562ed9c72252.slice/crio-conmon-693c3bcc8927765191605aa2e63974d17e30d9c34ae2ebaf265ed06522fb9c6b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod478a3729_f417_4458_b0c5_562ed9c72252.slice/crio-693c3bcc8927765191605aa2e63974d17e30d9c34ae2ebaf265ed06522fb9c6b.scope\": RecentStats: unable to find data in memory cache]" Mar 20 18:22:02 crc kubenswrapper[4795]: I0320 18:22:02.766671 4795 generic.go:334] "Generic (PLEG): container finished" podID="478a3729-f417-4458-b0c5-562ed9c72252" containerID="693c3bcc8927765191605aa2e63974d17e30d9c34ae2ebaf265ed06522fb9c6b" exitCode=0 Mar 20 18:22:02 crc kubenswrapper[4795]: I0320 18:22:02.766768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-5tldr" event={"ID":"478a3729-f417-4458-b0c5-562ed9c72252","Type":"ContainerDied","Data":"693c3bcc8927765191605aa2e63974d17e30d9c34ae2ebaf265ed06522fb9c6b"} Mar 20 
18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.273072 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.354917 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pwkq\" (UniqueName: \"kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq\") pod \"478a3729-f417-4458-b0c5-562ed9c72252\" (UID: \"478a3729-f417-4458-b0c5-562ed9c72252\") " Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.361530 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq" (OuterVolumeSpecName: "kube-api-access-6pwkq") pod "478a3729-f417-4458-b0c5-562ed9c72252" (UID: "478a3729-f417-4458-b0c5-562ed9c72252"). InnerVolumeSpecName "kube-api-access-6pwkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.457257 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pwkq\" (UniqueName: \"kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.789738 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-5tldr" event={"ID":"478a3729-f417-4458-b0c5-562ed9c72252","Type":"ContainerDied","Data":"0c73ef664439c6cd1a470ac299c5d4845b2d54a38cb2b91cfd6d7b76f6d9c6d7"} Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.789784 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c73ef664439c6cd1a470ac299c5d4845b2d54a38cb2b91cfd6d7b76f6d9c6d7" Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.789844 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:05 crc kubenswrapper[4795]: I0320 18:22:05.412003 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-jlvh2"] Mar 20 18:22:05 crc kubenswrapper[4795]: I0320 18:22:05.429942 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-jlvh2"] Mar 20 18:22:07 crc kubenswrapper[4795]: I0320 18:22:07.264741 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:22:07 crc kubenswrapper[4795]: I0320 18:22:07.264959 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6847a127-2563-4611-aa3c-5de097af7485" path="/var/lib/kubelet/pods/6847a127-2563-4611-aa3c-5de097af7485/volumes" Mar 20 18:22:07 crc kubenswrapper[4795]: E0320 18:22:07.265043 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:22:11 crc kubenswrapper[4795]: I0320 18:22:11.494423 4795 scope.go:117] "RemoveContainer" containerID="fcb0c54c2a527f381862afe1aaeeba3ced38b835a91522600710892ac634473c" Mar 20 18:22:20 crc kubenswrapper[4795]: I0320 18:22:20.251783 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:22:20 crc kubenswrapper[4795]: E0320 18:22:20.252526 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:22:33 crc kubenswrapper[4795]: I0320 18:22:33.255130 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:22:33 crc kubenswrapper[4795]: E0320 18:22:33.260237 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:22:44 crc kubenswrapper[4795]: I0320 18:22:44.252002 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:22:44 crc kubenswrapper[4795]: E0320 18:22:44.253053 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:22:59 crc kubenswrapper[4795]: I0320 18:22:59.256118 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:22:59 crc kubenswrapper[4795]: E0320 18:22:59.257082 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:23:14 crc kubenswrapper[4795]: I0320 18:23:14.251613 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:23:14 crc kubenswrapper[4795]: E0320 18:23:14.252326 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:23:29 crc kubenswrapper[4795]: I0320 18:23:29.253508 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:23:29 crc kubenswrapper[4795]: E0320 18:23:29.254344 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:23:44 crc kubenswrapper[4795]: I0320 18:23:44.253138 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:23:44 crc kubenswrapper[4795]: E0320 18:23:44.254447 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:23:58 crc kubenswrapper[4795]: I0320 18:23:58.252514 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:23:58 crc kubenswrapper[4795]: E0320 18:23:58.253356 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.151708 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567184-gzns5"] Mar 20 18:24:00 crc kubenswrapper[4795]: E0320 18:24:00.152536 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478a3729-f417-4458-b0c5-562ed9c72252" containerName="oc" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.152553 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="478a3729-f417-4458-b0c5-562ed9c72252" containerName="oc" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.152848 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="478a3729-f417-4458-b0c5-562ed9c72252" containerName="oc" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.153629 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.157186 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.157618 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.159249 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.164436 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-gzns5"] Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.235343 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzfmq\" (UniqueName: \"kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq\") pod \"auto-csr-approver-29567184-gzns5\" (UID: \"133e7bff-461c-4450-bf3b-8d43791045a4\") " pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.337026 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzfmq\" (UniqueName: \"kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq\") pod \"auto-csr-approver-29567184-gzns5\" (UID: \"133e7bff-461c-4450-bf3b-8d43791045a4\") " pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.355444 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzfmq\" (UniqueName: \"kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq\") pod \"auto-csr-approver-29567184-gzns5\" (UID: \"133e7bff-461c-4450-bf3b-8d43791045a4\") " 
pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.475356 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.928658 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-gzns5"] Mar 20 18:24:00 crc kubenswrapper[4795]: W0320 18:24:00.937253 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod133e7bff_461c_4450_bf3b_8d43791045a4.slice/crio-4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852 WatchSource:0}: Error finding container 4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852: Status 404 returned error can't find the container with id 4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852 Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.940598 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:24:01 crc kubenswrapper[4795]: I0320 18:24:01.944975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-gzns5" event={"ID":"133e7bff-461c-4450-bf3b-8d43791045a4","Type":"ContainerStarted","Data":"4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852"} Mar 20 18:24:02 crc kubenswrapper[4795]: I0320 18:24:02.965193 4795 generic.go:334] "Generic (PLEG): container finished" podID="133e7bff-461c-4450-bf3b-8d43791045a4" containerID="ccece14cbf4b4c8c9889d9ebad0a41bd1c87c88349be10c251ccd8a08eb4cac4" exitCode=0 Mar 20 18:24:02 crc kubenswrapper[4795]: I0320 18:24:02.965262 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-gzns5" 
event={"ID":"133e7bff-461c-4450-bf3b-8d43791045a4","Type":"ContainerDied","Data":"ccece14cbf4b4c8c9889d9ebad0a41bd1c87c88349be10c251ccd8a08eb4cac4"} Mar 20 18:24:04 crc kubenswrapper[4795]: I0320 18:24:04.401625 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:04 crc kubenswrapper[4795]: I0320 18:24:04.512762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzfmq\" (UniqueName: \"kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq\") pod \"133e7bff-461c-4450-bf3b-8d43791045a4\" (UID: \"133e7bff-461c-4450-bf3b-8d43791045a4\") " Mar 20 18:24:04 crc kubenswrapper[4795]: I0320 18:24:04.523007 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq" (OuterVolumeSpecName: "kube-api-access-qzfmq") pod "133e7bff-461c-4450-bf3b-8d43791045a4" (UID: "133e7bff-461c-4450-bf3b-8d43791045a4"). InnerVolumeSpecName "kube-api-access-qzfmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:24:04 crc kubenswrapper[4795]: I0320 18:24:04.614943 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzfmq\" (UniqueName: \"kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq\") on node \"crc\" DevicePath \"\"" Mar 20 18:24:05 crc kubenswrapper[4795]: I0320 18:24:05.003459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-gzns5" event={"ID":"133e7bff-461c-4450-bf3b-8d43791045a4","Type":"ContainerDied","Data":"4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852"} Mar 20 18:24:05 crc kubenswrapper[4795]: I0320 18:24:05.003496 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852" Mar 20 18:24:05 crc kubenswrapper[4795]: I0320 18:24:05.004016 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:05 crc kubenswrapper[4795]: I0320 18:24:05.484201 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-vbj6l"] Mar 20 18:24:05 crc kubenswrapper[4795]: I0320 18:24:05.493805 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-vbj6l"] Mar 20 18:24:07 crc kubenswrapper[4795]: I0320 18:24:07.268147 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf1bb697-899c-48fe-984a-61258e78cd87" path="/var/lib/kubelet/pods/cf1bb697-899c-48fe-984a-61258e78cd87/volumes" Mar 20 18:24:09 crc kubenswrapper[4795]: I0320 18:24:09.256440 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:24:09 crc kubenswrapper[4795]: E0320 18:24:09.257113 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:24:11 crc kubenswrapper[4795]: I0320 18:24:11.617393 4795 scope.go:117] "RemoveContainer" containerID="78ef24a78e7dcac7e46de63dc467ed76c00bbe2831c0a8a33ad6b914782524d5" Mar 20 18:24:21 crc kubenswrapper[4795]: I0320 18:24:21.251999 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:24:21 crc kubenswrapper[4795]: E0320 18:24:21.253521 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:24:36 crc kubenswrapper[4795]: I0320 18:24:36.251928 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:24:36 crc kubenswrapper[4795]: E0320 18:24:36.252783 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:24:50 crc kubenswrapper[4795]: I0320 18:24:50.252958 4795 scope.go:117] "RemoveContainer" 
containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:24:50 crc kubenswrapper[4795]: E0320 18:24:50.253869 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.252342 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:25:04 crc kubenswrapper[4795]: E0320 18:25:04.253222 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.567157 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:04 crc kubenswrapper[4795]: E0320 18:25:04.567505 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133e7bff-461c-4450-bf3b-8d43791045a4" containerName="oc" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.567522 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="133e7bff-461c-4450-bf3b-8d43791045a4" containerName="oc" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.567746 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="133e7bff-461c-4450-bf3b-8d43791045a4" containerName="oc" Mar 20 18:25:04 crc 
kubenswrapper[4795]: I0320 18:25:04.568997 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.595847 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.672195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97lxg\" (UniqueName: \"kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.672412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.672465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.773725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97lxg\" (UniqueName: \"kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc 
kubenswrapper[4795]: I0320 18:25:04.773880 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.773924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.774354 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.774450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.792139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97lxg\" (UniqueName: \"kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.909109 4795 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:05 crc kubenswrapper[4795]: W0320 18:25:05.389013 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5534910c_2643_4a80_8d50_82267e2567e0.slice/crio-2ad10827e27678d862a3371f7fc9ea6b3045e4278d126e488871cefd1947de82 WatchSource:0}: Error finding container 2ad10827e27678d862a3371f7fc9ea6b3045e4278d126e488871cefd1947de82: Status 404 returned error can't find the container with id 2ad10827e27678d862a3371f7fc9ea6b3045e4278d126e488871cefd1947de82 Mar 20 18:25:05 crc kubenswrapper[4795]: I0320 18:25:05.391373 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:05 crc kubenswrapper[4795]: I0320 18:25:05.544383 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerStarted","Data":"2ad10827e27678d862a3371f7fc9ea6b3045e4278d126e488871cefd1947de82"} Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.368402 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.371600 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.382587 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.504560 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.504639 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qlzf\" (UniqueName: \"kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.504740 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.552475 4795 generic.go:334] "Generic (PLEG): container finished" podID="5534910c-2643-4a80-8d50-82267e2567e0" containerID="9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6" exitCode=0 Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.552513 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" 
event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerDied","Data":"9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6"} Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.606894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.606964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qlzf\" (UniqueName: \"kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.607025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.607768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.607777 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content\") pod 
\"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.632954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qlzf\" (UniqueName: \"kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.691189 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:07 crc kubenswrapper[4795]: I0320 18:25:07.219283 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:07 crc kubenswrapper[4795]: I0320 18:25:07.565118 4795 generic.go:334] "Generic (PLEG): container finished" podID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerID="847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1" exitCode=0 Mar 20 18:25:07 crc kubenswrapper[4795]: I0320 18:25:07.565520 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerDied","Data":"847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1"} Mar 20 18:25:07 crc kubenswrapper[4795]: I0320 18:25:07.565568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerStarted","Data":"0988077d812aa21cebf551a9213e7cdabc69aef5852f2e0a86b4777259c6f40a"} Mar 20 18:25:08 crc kubenswrapper[4795]: I0320 18:25:08.593355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" 
event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerStarted","Data":"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1"} Mar 20 18:25:08 crc kubenswrapper[4795]: I0320 18:25:08.596080 4795 generic.go:334] "Generic (PLEG): container finished" podID="5534910c-2643-4a80-8d50-82267e2567e0" containerID="1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4" exitCode=0 Mar 20 18:25:08 crc kubenswrapper[4795]: I0320 18:25:08.596115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerDied","Data":"1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4"} Mar 20 18:25:09 crc kubenswrapper[4795]: I0320 18:25:09.606520 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerStarted","Data":"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a"} Mar 20 18:25:09 crc kubenswrapper[4795]: I0320 18:25:09.608961 4795 generic.go:334] "Generic (PLEG): container finished" podID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerID="3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1" exitCode=0 Mar 20 18:25:09 crc kubenswrapper[4795]: I0320 18:25:09.609016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerDied","Data":"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1"} Mar 20 18:25:09 crc kubenswrapper[4795]: I0320 18:25:09.634060 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s45dr" podStartSLOduration=3.19629929 podStartE2EDuration="5.634036269s" podCreationTimestamp="2026-03-20 18:25:04 +0000 UTC" firstStartedPulling="2026-03-20 18:25:06.553993886 +0000 UTC 
m=+4050.012025427" lastFinishedPulling="2026-03-20 18:25:08.991730875 +0000 UTC m=+4052.449762406" observedRunningTime="2026-03-20 18:25:09.626888117 +0000 UTC m=+4053.084919658" watchObservedRunningTime="2026-03-20 18:25:09.634036269 +0000 UTC m=+4053.092067820" Mar 20 18:25:10 crc kubenswrapper[4795]: I0320 18:25:10.619283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerStarted","Data":"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8"} Mar 20 18:25:10 crc kubenswrapper[4795]: I0320 18:25:10.644558 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8s8nl" podStartSLOduration=1.967046352 podStartE2EDuration="4.644536491s" podCreationTimestamp="2026-03-20 18:25:06 +0000 UTC" firstStartedPulling="2026-03-20 18:25:07.567796171 +0000 UTC m=+4051.025827752" lastFinishedPulling="2026-03-20 18:25:10.24528634 +0000 UTC m=+4053.703317891" observedRunningTime="2026-03-20 18:25:10.641024701 +0000 UTC m=+4054.099056262" watchObservedRunningTime="2026-03-20 18:25:10.644536491 +0000 UTC m=+4054.102568042" Mar 20 18:25:14 crc kubenswrapper[4795]: I0320 18:25:14.909948 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:14 crc kubenswrapper[4795]: I0320 18:25:14.910228 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:15 crc kubenswrapper[4795]: I0320 18:25:15.960309 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s45dr" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="registry-server" probeResult="failure" output=< Mar 20 18:25:15 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 18:25:15 crc 
kubenswrapper[4795]: > Mar 20 18:25:16 crc kubenswrapper[4795]: I0320 18:25:16.253269 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:25:16 crc kubenswrapper[4795]: I0320 18:25:16.691922 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:16 crc kubenswrapper[4795]: I0320 18:25:16.692212 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:16 crc kubenswrapper[4795]: I0320 18:25:16.749765 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:17 crc kubenswrapper[4795]: I0320 18:25:17.711332 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc"} Mar 20 18:25:17 crc kubenswrapper[4795]: I0320 18:25:17.791271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:17 crc kubenswrapper[4795]: I0320 18:25:17.858368 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:19 crc kubenswrapper[4795]: I0320 18:25:19.731006 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8s8nl" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="registry-server" containerID="cri-o://bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8" gracePeriod=2 Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.327649 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.403511 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content\") pod \"3336a777-640c-4ec9-a1f7-27a05d6efe01\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.403673 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities\") pod \"3336a777-640c-4ec9-a1f7-27a05d6efe01\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.403928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qlzf\" (UniqueName: \"kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf\") pod \"3336a777-640c-4ec9-a1f7-27a05d6efe01\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.406086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities" (OuterVolumeSpecName: "utilities") pod "3336a777-640c-4ec9-a1f7-27a05d6efe01" (UID: "3336a777-640c-4ec9-a1f7-27a05d6efe01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.411073 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf" (OuterVolumeSpecName: "kube-api-access-8qlzf") pod "3336a777-640c-4ec9-a1f7-27a05d6efe01" (UID: "3336a777-640c-4ec9-a1f7-27a05d6efe01"). InnerVolumeSpecName "kube-api-access-8qlzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.462033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3336a777-640c-4ec9-a1f7-27a05d6efe01" (UID: "3336a777-640c-4ec9-a1f7-27a05d6efe01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.506783 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.506825 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qlzf\" (UniqueName: \"kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.506844 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.742095 4795 generic.go:334] "Generic (PLEG): container finished" podID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerID="bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8" exitCode=0 Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.742294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerDied","Data":"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8"} Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.742353 4795 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.742404 4795 scope.go:117] "RemoveContainer" containerID="bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.742394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerDied","Data":"0988077d812aa21cebf551a9213e7cdabc69aef5852f2e0a86b4777259c6f40a"} Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.768460 4795 scope.go:117] "RemoveContainer" containerID="3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.773321 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.781806 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.271141 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" path="/var/lib/kubelet/pods/3336a777-640c-4ec9-a1f7-27a05d6efe01/volumes" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.284211 4795 scope.go:117] "RemoveContainer" containerID="847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.365951 4795 scope.go:117] "RemoveContainer" containerID="bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8" Mar 20 18:25:21 crc kubenswrapper[4795]: E0320 18:25:21.366573 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8\": container with ID 
starting with bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8 not found: ID does not exist" containerID="bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.366722 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8"} err="failed to get container status \"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8\": rpc error: code = NotFound desc = could not find container \"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8\": container with ID starting with bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8 not found: ID does not exist" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.366758 4795 scope.go:117] "RemoveContainer" containerID="3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1" Mar 20 18:25:21 crc kubenswrapper[4795]: E0320 18:25:21.367128 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1\": container with ID starting with 3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1 not found: ID does not exist" containerID="3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.367163 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1"} err="failed to get container status \"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1\": rpc error: code = NotFound desc = could not find container \"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1\": container with ID starting with 3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1 not found: 
ID does not exist" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.367184 4795 scope.go:117] "RemoveContainer" containerID="847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1" Mar 20 18:25:21 crc kubenswrapper[4795]: E0320 18:25:21.367423 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1\": container with ID starting with 847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1 not found: ID does not exist" containerID="847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.367452 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1"} err="failed to get container status \"847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1\": rpc error: code = NotFound desc = could not find container \"847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1\": container with ID starting with 847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1 not found: ID does not exist" Mar 20 18:25:24 crc kubenswrapper[4795]: I0320 18:25:24.970553 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:25 crc kubenswrapper[4795]: I0320 18:25:25.031082 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:25 crc kubenswrapper[4795]: I0320 18:25:25.324862 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:26 crc kubenswrapper[4795]: I0320 18:25:26.808974 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s45dr" 
podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="registry-server" containerID="cri-o://9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a" gracePeriod=2 Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.316768 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.451865 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content\") pod \"5534910c-2643-4a80-8d50-82267e2567e0\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.452010 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97lxg\" (UniqueName: \"kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg\") pod \"5534910c-2643-4a80-8d50-82267e2567e0\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.452893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities\") pod \"5534910c-2643-4a80-8d50-82267e2567e0\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.453733 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities" (OuterVolumeSpecName: "utilities") pod "5534910c-2643-4a80-8d50-82267e2567e0" (UID: "5534910c-2643-4a80-8d50-82267e2567e0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.453824 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.458338 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg" (OuterVolumeSpecName: "kube-api-access-97lxg") pod "5534910c-2643-4a80-8d50-82267e2567e0" (UID: "5534910c-2643-4a80-8d50-82267e2567e0"). InnerVolumeSpecName "kube-api-access-97lxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.555056 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97lxg\" (UniqueName: \"kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.598232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5534910c-2643-4a80-8d50-82267e2567e0" (UID: "5534910c-2643-4a80-8d50-82267e2567e0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.657191 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.819959 4795 generic.go:334] "Generic (PLEG): container finished" podID="5534910c-2643-4a80-8d50-82267e2567e0" containerID="9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a" exitCode=0 Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.820037 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.820060 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerDied","Data":"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a"} Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.820486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerDied","Data":"2ad10827e27678d862a3371f7fc9ea6b3045e4278d126e488871cefd1947de82"} Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.820523 4795 scope.go:117] "RemoveContainer" containerID="9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.863508 4795 scope.go:117] "RemoveContainer" containerID="1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.865351 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 
18:25:27.873492 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.899645 4795 scope.go:117] "RemoveContainer" containerID="9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.964228 4795 scope.go:117] "RemoveContainer" containerID="9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a" Mar 20 18:25:27 crc kubenswrapper[4795]: E0320 18:25:27.964941 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a\": container with ID starting with 9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a not found: ID does not exist" containerID="9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.964980 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a"} err="failed to get container status \"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a\": rpc error: code = NotFound desc = could not find container \"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a\": container with ID starting with 9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a not found: ID does not exist" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.965005 4795 scope.go:117] "RemoveContainer" containerID="1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4" Mar 20 18:25:27 crc kubenswrapper[4795]: E0320 18:25:27.965314 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4\": container with ID 
starting with 1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4 not found: ID does not exist" containerID="1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.965338 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4"} err="failed to get container status \"1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4\": rpc error: code = NotFound desc = could not find container \"1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4\": container with ID starting with 1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4 not found: ID does not exist" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.965352 4795 scope.go:117] "RemoveContainer" containerID="9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6" Mar 20 18:25:27 crc kubenswrapper[4795]: E0320 18:25:27.965571 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6\": container with ID starting with 9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6 not found: ID does not exist" containerID="9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.965594 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6"} err="failed to get container status \"9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6\": rpc error: code = NotFound desc = could not find container \"9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6\": container with ID starting with 9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6 not found: 
ID does not exist" Mar 20 18:25:29 crc kubenswrapper[4795]: I0320 18:25:29.274170 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5534910c-2643-4a80-8d50-82267e2567e0" path="/var/lib/kubelet/pods/5534910c-2643-4a80-8d50-82267e2567e0/volumes" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.164604 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567186-l88zk"] Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166363 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="extract-utilities" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166393 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="extract-utilities" Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166415 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="extract-content" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166422 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="extract-content" Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166441 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166449 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166464 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="extract-content" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166470 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5534910c-2643-4a80-8d50-82267e2567e0" 
containerName="extract-content" Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166498 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166506 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166520 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="extract-utilities" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166529 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="extract-utilities" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166741 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166764 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.167571 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.170180 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.170231 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.170436 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.186654 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-l88zk"] Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.268922 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ztgr\" (UniqueName: \"kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr\") pod \"auto-csr-approver-29567186-l88zk\" (UID: \"654cb8e4-7fd7-4e3e-955a-a71906ccfb79\") " pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.372028 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ztgr\" (UniqueName: \"kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr\") pod \"auto-csr-approver-29567186-l88zk\" (UID: \"654cb8e4-7fd7-4e3e-955a-a71906ccfb79\") " pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.399459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ztgr\" (UniqueName: \"kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr\") pod \"auto-csr-approver-29567186-l88zk\" (UID: \"654cb8e4-7fd7-4e3e-955a-a71906ccfb79\") " 
pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.503938 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:01 crc kubenswrapper[4795]: I0320 18:26:01.011584 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-l88zk"] Mar 20 18:26:01 crc kubenswrapper[4795]: I0320 18:26:01.210647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-l88zk" event={"ID":"654cb8e4-7fd7-4e3e-955a-a71906ccfb79","Type":"ContainerStarted","Data":"2dbb1b4ae1e3b69e21cb5fdf19f437788fb04da0d018b660857ad86909e5691e"} Mar 20 18:26:03 crc kubenswrapper[4795]: I0320 18:26:03.232330 4795 generic.go:334] "Generic (PLEG): container finished" podID="654cb8e4-7fd7-4e3e-955a-a71906ccfb79" containerID="43b386b54b2c5ae34c509074586a617552173f2543677683ef8d11caf140f2f9" exitCode=0 Mar 20 18:26:03 crc kubenswrapper[4795]: I0320 18:26:03.232435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-l88zk" event={"ID":"654cb8e4-7fd7-4e3e-955a-a71906ccfb79","Type":"ContainerDied","Data":"43b386b54b2c5ae34c509074586a617552173f2543677683ef8d11caf140f2f9"} Mar 20 18:26:04 crc kubenswrapper[4795]: I0320 18:26:04.654162 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:04 crc kubenswrapper[4795]: I0320 18:26:04.754463 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ztgr\" (UniqueName: \"kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr\") pod \"654cb8e4-7fd7-4e3e-955a-a71906ccfb79\" (UID: \"654cb8e4-7fd7-4e3e-955a-a71906ccfb79\") " Mar 20 18:26:04 crc kubenswrapper[4795]: I0320 18:26:04.762934 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr" (OuterVolumeSpecName: "kube-api-access-6ztgr") pod "654cb8e4-7fd7-4e3e-955a-a71906ccfb79" (UID: "654cb8e4-7fd7-4e3e-955a-a71906ccfb79"). InnerVolumeSpecName "kube-api-access-6ztgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:26:04 crc kubenswrapper[4795]: I0320 18:26:04.856782 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ztgr\" (UniqueName: \"kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:05 crc kubenswrapper[4795]: I0320 18:26:05.263387 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:05 crc kubenswrapper[4795]: I0320 18:26:05.286987 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-l88zk" event={"ID":"654cb8e4-7fd7-4e3e-955a-a71906ccfb79","Type":"ContainerDied","Data":"2dbb1b4ae1e3b69e21cb5fdf19f437788fb04da0d018b660857ad86909e5691e"} Mar 20 18:26:05 crc kubenswrapper[4795]: I0320 18:26:05.287048 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dbb1b4ae1e3b69e21cb5fdf19f437788fb04da0d018b660857ad86909e5691e" Mar 20 18:26:05 crc kubenswrapper[4795]: I0320 18:26:05.719123 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-hdwfg"] Mar 20 18:26:05 crc kubenswrapper[4795]: I0320 18:26:05.727470 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-hdwfg"] Mar 20 18:26:07 crc kubenswrapper[4795]: I0320 18:26:07.260971 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" path="/var/lib/kubelet/pods/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec/volumes" Mar 20 18:26:11 crc kubenswrapper[4795]: I0320 18:26:11.719702 4795 scope.go:117] "RemoveContainer" containerID="fa6146c0ab5a972766bc66e4407b6570440e104ac1129b01c598c47dc5cc0faf" Mar 20 18:26:53 crc kubenswrapper[4795]: I0320 18:26:53.768118 4795 generic.go:334] "Generic (PLEG): container finished" podID="caaf60a5-8c45-4831-8d26-8cf808f1da7a" containerID="ca1e86805a9f6b3f6807f721075e3f792e3f51254780ac13719c7eec007f4373" exitCode=0 Mar 20 18:26:53 crc kubenswrapper[4795]: I0320 18:26:53.768238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"caaf60a5-8c45-4831-8d26-8cf808f1da7a","Type":"ContainerDied","Data":"ca1e86805a9f6b3f6807f721075e3f792e3f51254780ac13719c7eec007f4373"} Mar 20 18:26:55 crc kubenswrapper[4795]: 
I0320 18:26:55.211393 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.321984 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.322247 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.322310 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.322343 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.322412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w88sw\" (UniqueName: \"kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.322988 
4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.323042 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.323078 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.323113 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.323683 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.323725 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data" (OuterVolumeSpecName: "config-data") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.333259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.334073 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.334270 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw" (OuterVolumeSpecName: "kube-api-access-w88sw") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "kube-api-access-w88sw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.354578 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.361633 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.374806 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.385827 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.425525 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.425759 4795 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.425849 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.425930 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w88sw\" (UniqueName: \"kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.426015 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.426108 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.429732 4795 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.429859 4795 
reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.430122 4795 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.445070 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.531612 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.792075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"caaf60a5-8c45-4831-8d26-8cf808f1da7a","Type":"ContainerDied","Data":"7bdfe7f881d951a74ca8b66b0f91841dff449bc239ef1c2b7c679ee61596377d"} Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.792125 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bdfe7f881d951a74ca8b66b0f91841dff449bc239ef1c2b7c679ee61596377d" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.792249 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.030021 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:27:00 crc kubenswrapper[4795]: E0320 18:27:00.030876 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caaf60a5-8c45-4831-8d26-8cf808f1da7a" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.030890 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="caaf60a5-8c45-4831-8d26-8cf808f1da7a" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:27:00 crc kubenswrapper[4795]: E0320 18:27:00.030912 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654cb8e4-7fd7-4e3e-955a-a71906ccfb79" containerName="oc" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.030919 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="654cb8e4-7fd7-4e3e-955a-a71906ccfb79" containerName="oc" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.031143 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="654cb8e4-7fd7-4e3e-955a-a71906ccfb79" containerName="oc" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.031174 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="caaf60a5-8c45-4831-8d26-8cf808f1da7a" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.031797 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.034346 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-zgwjr" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.052849 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.121184 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.121312 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54b6h\" (UniqueName: \"kubernetes.io/projected/3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0-kube-api-access-54b6h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.222968 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54b6h\" (UniqueName: \"kubernetes.io/projected/3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0-kube-api-access-54b6h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.223134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.223662 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.245408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54b6h\" (UniqueName: \"kubernetes.io/projected/3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0-kube-api-access-54b6h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.259576 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.350113 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.786018 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.837761 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0","Type":"ContainerStarted","Data":"02466a550805e88a2a340de106b3d33013005661fbbe664ff91ad333baba735b"} Mar 20 18:27:02 crc kubenswrapper[4795]: I0320 18:27:02.863750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0","Type":"ContainerStarted","Data":"e60aa9f2ea835d15178a6c165de56d214c6d87950df63a46486e65f878f732ad"} Mar 20 18:27:02 crc kubenswrapper[4795]: I0320 18:27:02.889853 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.216400545 podStartE2EDuration="2.889832532s" podCreationTimestamp="2026-03-20 18:27:00 +0000 UTC" firstStartedPulling="2026-03-20 18:27:00.799375512 +0000 UTC m=+4164.257407063" lastFinishedPulling="2026-03-20 18:27:02.472807509 +0000 UTC m=+4165.930839050" observedRunningTime="2026-03-20 18:27:02.881270046 +0000 UTC m=+4166.339301597" watchObservedRunningTime="2026-03-20 18:27:02.889832532 +0000 UTC m=+4166.347864083" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.831611 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2wfg/must-gather-gb8cc"] Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.834200 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.835555 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n2wfg"/"openshift-service-ca.crt" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.835946 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n2wfg"/"kube-root-ca.crt" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.836057 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n2wfg"/"default-dockercfg-khqtb" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.845269 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n2wfg/must-gather-gb8cc"] Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.956861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq4p6\" (UniqueName: \"kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.956908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.058911 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq4p6\" (UniqueName: \"kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " 
pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.058959 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.059486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.078579 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq4p6\" (UniqueName: \"kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.151591 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.605887 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n2wfg/must-gather-gb8cc"] Mar 20 18:27:29 crc kubenswrapper[4795]: I0320 18:27:29.131059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" event={"ID":"a508da41-3cdb-4b99-b14e-a917c5153c72","Type":"ContainerStarted","Data":"6b5dd8edc9985b61820c61f74a3a75491202da230357733de6439e66a17b3693"} Mar 20 18:27:34 crc kubenswrapper[4795]: I0320 18:27:34.181526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" event={"ID":"a508da41-3cdb-4b99-b14e-a917c5153c72","Type":"ContainerStarted","Data":"752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11"} Mar 20 18:27:34 crc kubenswrapper[4795]: I0320 18:27:34.182187 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" event={"ID":"a508da41-3cdb-4b99-b14e-a917c5153c72","Type":"ContainerStarted","Data":"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"} Mar 20 18:27:34 crc kubenswrapper[4795]: I0320 18:27:34.200184 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" podStartSLOduration=2.9078369840000002 podStartE2EDuration="7.200159292s" podCreationTimestamp="2026-03-20 18:27:27 +0000 UTC" firstStartedPulling="2026-03-20 18:27:28.873988848 +0000 UTC m=+4192.332020429" lastFinishedPulling="2026-03-20 18:27:33.166311206 +0000 UTC m=+4196.624342737" observedRunningTime="2026-03-20 18:27:34.197115027 +0000 UTC m=+4197.655146578" watchObservedRunningTime="2026-03-20 18:27:34.200159292 +0000 UTC m=+4197.658190873" Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.742885 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-n2wfg/crc-debug-97766"] Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.745084 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-97766" Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.759998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766" Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.760308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fk5b\" (UniqueName: \"kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766" Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.861914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fk5b\" (UniqueName: \"kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766" Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.862068 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766" Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.862277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766" Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.888488 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fk5b\" (UniqueName: \"kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766" Mar 20 18:27:38 crc kubenswrapper[4795]: I0320 18:27:38.066896 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-97766" Mar 20 18:27:38 crc kubenswrapper[4795]: I0320 18:27:38.238521 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-97766" event={"ID":"632805f2-aba6-41af-89dd-8b176af4ab77","Type":"ContainerStarted","Data":"c6ea7c0685ab870ea1880247614497aae537421fa8bfa8af0cdffd21865b427c"} Mar 20 18:27:41 crc kubenswrapper[4795]: I0320 18:27:41.300490 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:27:41 crc kubenswrapper[4795]: I0320 18:27:41.302655 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.291705 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-bhjxn"] Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.294919 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.300205 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"] Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.411234 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.411581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v66dt\" (UniqueName: \"kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.411656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.514024 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") 
" pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.514097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v66dt\" (UniqueName: \"kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.514221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.514817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.514833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.536246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v66dt\" (UniqueName: \"kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " 
pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.627635 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:47 crc kubenswrapper[4795]: I0320 18:27:47.875589 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"] Mar 20 18:27:48 crc kubenswrapper[4795]: I0320 18:27:48.340969 4795 generic.go:334] "Generic (PLEG): container finished" podID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerID="1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4" exitCode=0 Mar 20 18:27:48 crc kubenswrapper[4795]: I0320 18:27:48.341063 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerDied","Data":"1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4"} Mar 20 18:27:48 crc kubenswrapper[4795]: I0320 18:27:48.341379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerStarted","Data":"4ccd8707afcd417a2b4fcfee27edadddc134eab632522b4d93bf84758023574e"} Mar 20 18:27:48 crc kubenswrapper[4795]: I0320 18:27:48.345063 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-97766" event={"ID":"632805f2-aba6-41af-89dd-8b176af4ab77","Type":"ContainerStarted","Data":"ac8208908e414910c05525aac4a0b345cc75b5ac6d2db89cd45e2d5c13fcd4e8"} Mar 20 18:27:48 crc kubenswrapper[4795]: I0320 18:27:48.385675 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n2wfg/crc-debug-97766" podStartSLOduration=2.033673474 podStartE2EDuration="11.385644383s" podCreationTimestamp="2026-03-20 18:27:37 +0000 UTC" firstStartedPulling="2026-03-20 18:27:38.123292811 
+0000 UTC m=+4201.581324352" lastFinishedPulling="2026-03-20 18:27:47.47526372 +0000 UTC m=+4210.933295261" observedRunningTime="2026-03-20 18:27:48.380585967 +0000 UTC m=+4211.838617508" watchObservedRunningTime="2026-03-20 18:27:48.385644383 +0000 UTC m=+4211.843675924" Mar 20 18:27:49 crc kubenswrapper[4795]: I0320 18:27:49.370874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerStarted","Data":"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1"} Mar 20 18:27:50 crc kubenswrapper[4795]: I0320 18:27:50.378361 4795 generic.go:334] "Generic (PLEG): container finished" podID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerID="9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1" exitCode=0 Mar 20 18:27:50 crc kubenswrapper[4795]: I0320 18:27:50.378523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerDied","Data":"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1"} Mar 20 18:27:51 crc kubenswrapper[4795]: I0320 18:27:51.392517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerStarted","Data":"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f"} Mar 20 18:27:51 crc kubenswrapper[4795]: I0320 18:27:51.423647 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bhjxn" podStartSLOduration=3.990576168 podStartE2EDuration="6.423625691s" podCreationTimestamp="2026-03-20 18:27:45 +0000 UTC" firstStartedPulling="2026-03-20 18:27:48.344392533 +0000 UTC m=+4211.802424074" lastFinishedPulling="2026-03-20 18:27:50.777442046 +0000 UTC m=+4214.235473597" observedRunningTime="2026-03-20 
18:27:51.419538024 +0000 UTC m=+4214.877569585" watchObservedRunningTime="2026-03-20 18:27:51.423625691 +0000 UTC m=+4214.881657232" Mar 20 18:27:55 crc kubenswrapper[4795]: I0320 18:27:55.628709 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:55 crc kubenswrapper[4795]: I0320 18:27:55.630191 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:55 crc kubenswrapper[4795]: I0320 18:27:55.675923 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:56 crc kubenswrapper[4795]: I0320 18:27:56.532376 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:56 crc kubenswrapper[4795]: I0320 18:27:56.589396 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"] Mar 20 18:27:58 crc kubenswrapper[4795]: I0320 18:27:58.470206 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bhjxn" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="registry-server" containerID="cri-o://5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f" gracePeriod=2 Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.393551 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.481121 4795 generic.go:334] "Generic (PLEG): container finished" podID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerID="5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f" exitCode=0 Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.481160 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerDied","Data":"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f"} Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.481174 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhjxn" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.481185 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerDied","Data":"4ccd8707afcd417a2b4fcfee27edadddc134eab632522b4d93bf84758023574e"} Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.481202 4795 scope.go:117] "RemoveContainer" containerID="5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.483759 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v66dt\" (UniqueName: \"kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt\") pod \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.483840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities\") pod 
\"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.483935 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content\") pod \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.484607 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities" (OuterVolumeSpecName: "utilities") pod "c9c595cc-573a-4c5d-95f9-48d3e0289c6f" (UID: "c9c595cc-573a-4c5d-95f9-48d3e0289c6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.531431 4795 scope.go:117] "RemoveContainer" containerID="9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.560727 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9c595cc-573a-4c5d-95f9-48d3e0289c6f" (UID: "c9c595cc-573a-4c5d-95f9-48d3e0289c6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.565989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt" (OuterVolumeSpecName: "kube-api-access-v66dt") pod "c9c595cc-573a-4c5d-95f9-48d3e0289c6f" (UID: "c9c595cc-573a-4c5d-95f9-48d3e0289c6f"). InnerVolumeSpecName "kube-api-access-v66dt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.580820 4795 scope.go:117] "RemoveContainer" containerID="1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.586126 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.586157 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v66dt\" (UniqueName: \"kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt\") on node \"crc\" DevicePath \"\"" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.586169 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.598287 4795 scope.go:117] "RemoveContainer" containerID="5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f" Mar 20 18:27:59 crc kubenswrapper[4795]: E0320 18:27:59.600408 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f\": container with ID starting with 5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f not found: ID does not exist" containerID="5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.600505 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f"} err="failed to get container status 
\"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f\": rpc error: code = NotFound desc = could not find container \"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f\": container with ID starting with 5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f not found: ID does not exist" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.600565 4795 scope.go:117] "RemoveContainer" containerID="9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1" Mar 20 18:27:59 crc kubenswrapper[4795]: E0320 18:27:59.601045 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1\": container with ID starting with 9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1 not found: ID does not exist" containerID="9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.601083 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1"} err="failed to get container status \"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1\": rpc error: code = NotFound desc = could not find container \"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1\": container with ID starting with 9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1 not found: ID does not exist" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.601104 4795 scope.go:117] "RemoveContainer" containerID="1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4" Mar 20 18:27:59 crc kubenswrapper[4795]: E0320 18:27:59.603534 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4\": container with ID starting with 1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4 not found: ID does not exist" containerID="1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.603573 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4"} err="failed to get container status \"1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4\": rpc error: code = NotFound desc = could not find container \"1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4\": container with ID starting with 1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4 not found: ID does not exist" Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.817066 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"] Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.825702 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"] Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.151627 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567188-97rr4"] Mar 20 18:28:00 crc kubenswrapper[4795]: E0320 18:28:00.152364 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="extract-utilities" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.152382 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="extract-utilities" Mar 20 18:28:00 crc kubenswrapper[4795]: E0320 18:28:00.152397 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="extract-content" Mar 20 
18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.152404 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="extract-content" Mar 20 18:28:00 crc kubenswrapper[4795]: E0320 18:28:00.152413 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="registry-server" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.152420 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="registry-server" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.152593 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="registry-server" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.153196 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-97rr4" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.156428 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.156435 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.157483 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.165413 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-97rr4"] Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.301821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lct6v\" (UniqueName: \"kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v\") pod 
\"auto-csr-approver-29567188-97rr4\" (UID: \"551830bd-5613-42fb-b4ad-b1c6c6a0b09c\") " pod="openshift-infra/auto-csr-approver-29567188-97rr4" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.405575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lct6v\" (UniqueName: \"kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v\") pod \"auto-csr-approver-29567188-97rr4\" (UID: \"551830bd-5613-42fb-b4ad-b1c6c6a0b09c\") " pod="openshift-infra/auto-csr-approver-29567188-97rr4" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.448646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lct6v\" (UniqueName: \"kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v\") pod \"auto-csr-approver-29567188-97rr4\" (UID: \"551830bd-5613-42fb-b4ad-b1c6c6a0b09c\") " pod="openshift-infra/auto-csr-approver-29567188-97rr4" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.476522 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-97rr4" Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.938496 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-97rr4"] Mar 20 18:28:01 crc kubenswrapper[4795]: I0320 18:28:01.262086 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" path="/var/lib/kubelet/pods/c9c595cc-573a-4c5d-95f9-48d3e0289c6f/volumes" Mar 20 18:28:01 crc kubenswrapper[4795]: I0320 18:28:01.518137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-97rr4" event={"ID":"551830bd-5613-42fb-b4ad-b1c6c6a0b09c","Type":"ContainerStarted","Data":"68f9e7c7ac5591cc6f6011c1b081e3f974b165cad8cd260e55cb3eea35ddf0b9"} Mar 20 18:28:03 crc kubenswrapper[4795]: I0320 18:28:03.535270 4795 generic.go:334] "Generic (PLEG): container finished" podID="551830bd-5613-42fb-b4ad-b1c6c6a0b09c" containerID="29253cb593d65e36df8393e9b2e7d2df325902972a57f35ad5e0d8767eaa777e" exitCode=0 Mar 20 18:28:03 crc kubenswrapper[4795]: I0320 18:28:03.535731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-97rr4" event={"ID":"551830bd-5613-42fb-b4ad-b1c6c6a0b09c","Type":"ContainerDied","Data":"29253cb593d65e36df8393e9b2e7d2df325902972a57f35ad5e0d8767eaa777e"} Mar 20 18:28:05 crc kubenswrapper[4795]: I0320 18:28:05.802856 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-97rr4" Mar 20 18:28:05 crc kubenswrapper[4795]: I0320 18:28:05.904662 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lct6v\" (UniqueName: \"kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v\") pod \"551830bd-5613-42fb-b4ad-b1c6c6a0b09c\" (UID: \"551830bd-5613-42fb-b4ad-b1c6c6a0b09c\") " Mar 20 18:28:05 crc kubenswrapper[4795]: I0320 18:28:05.910971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v" (OuterVolumeSpecName: "kube-api-access-lct6v") pod "551830bd-5613-42fb-b4ad-b1c6c6a0b09c" (UID: "551830bd-5613-42fb-b4ad-b1c6c6a0b09c"). InnerVolumeSpecName "kube-api-access-lct6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.006977 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lct6v\" (UniqueName: \"kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.566245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-97rr4" event={"ID":"551830bd-5613-42fb-b4ad-b1c6c6a0b09c","Type":"ContainerDied","Data":"68f9e7c7ac5591cc6f6011c1b081e3f974b165cad8cd260e55cb3eea35ddf0b9"} Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.566286 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68f9e7c7ac5591cc6f6011c1b081e3f974b165cad8cd260e55cb3eea35ddf0b9" Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.566362 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-97rr4" Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.876865 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-5tldr"] Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.884670 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-5tldr"] Mar 20 18:28:07 crc kubenswrapper[4795]: I0320 18:28:07.263734 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478a3729-f417-4458-b0c5-562ed9c72252" path="/var/lib/kubelet/pods/478a3729-f417-4458-b0c5-562ed9c72252/volumes" Mar 20 18:28:11 crc kubenswrapper[4795]: I0320 18:28:11.299938 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:28:11 crc kubenswrapper[4795]: I0320 18:28:11.300494 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:28:11 crc kubenswrapper[4795]: I0320 18:28:11.899190 4795 scope.go:117] "RemoveContainer" containerID="693c3bcc8927765191605aa2e63974d17e30d9c34ae2ebaf265ed06522fb9c6b" Mar 20 18:28:34 crc kubenswrapper[4795]: I0320 18:28:34.822223 4795 generic.go:334] "Generic (PLEG): container finished" podID="632805f2-aba6-41af-89dd-8b176af4ab77" containerID="ac8208908e414910c05525aac4a0b345cc75b5ac6d2db89cd45e2d5c13fcd4e8" exitCode=0 Mar 20 18:28:34 crc kubenswrapper[4795]: I0320 18:28:34.822302 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-n2wfg/crc-debug-97766" event={"ID":"632805f2-aba6-41af-89dd-8b176af4ab77","Type":"ContainerDied","Data":"ac8208908e414910c05525aac4a0b345cc75b5ac6d2db89cd45e2d5c13fcd4e8"} Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.942285 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-97766" Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.976044 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-97766"] Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.986363 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-97766"] Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.989266 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host\") pod \"632805f2-aba6-41af-89dd-8b176af4ab77\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.989354 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fk5b\" (UniqueName: \"kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b\") pod \"632805f2-aba6-41af-89dd-8b176af4ab77\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.989417 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host" (OuterVolumeSpecName: "host") pod "632805f2-aba6-41af-89dd-8b176af4ab77" (UID: "632805f2-aba6-41af-89dd-8b176af4ab77"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.989851 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.995957 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b" (OuterVolumeSpecName: "kube-api-access-9fk5b") pod "632805f2-aba6-41af-89dd-8b176af4ab77" (UID: "632805f2-aba6-41af-89dd-8b176af4ab77"). InnerVolumeSpecName "kube-api-access-9fk5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:36 crc kubenswrapper[4795]: I0320 18:28:36.092114 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fk5b\" (UniqueName: \"kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:36 crc kubenswrapper[4795]: I0320 18:28:36.842419 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6ea7c0685ab870ea1880247614497aae537421fa8bfa8af0cdffd21865b427c" Mar 20 18:28:36 crc kubenswrapper[4795]: I0320 18:28:36.842464 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-97766" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.163629 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-zjcbs"] Mar 20 18:28:37 crc kubenswrapper[4795]: E0320 18:28:37.164100 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632805f2-aba6-41af-89dd-8b176af4ab77" containerName="container-00" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.164114 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="632805f2-aba6-41af-89dd-8b176af4ab77" containerName="container-00" Mar 20 18:28:37 crc kubenswrapper[4795]: E0320 18:28:37.164161 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551830bd-5613-42fb-b4ad-b1c6c6a0b09c" containerName="oc" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.164170 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="551830bd-5613-42fb-b4ad-b1c6c6a0b09c" containerName="oc" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.164349 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="632805f2-aba6-41af-89dd-8b176af4ab77" containerName="container-00" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.164364 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="551830bd-5613-42fb-b4ad-b1c6c6a0b09c" containerName="oc" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.164930 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.210585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.210744 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxr7b\" (UniqueName: \"kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.262879 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632805f2-aba6-41af-89dd-8b176af4ab77" path="/var/lib/kubelet/pods/632805f2-aba6-41af-89dd-8b176af4ab77/volumes" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.312286 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxr7b\" (UniqueName: \"kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.312448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.312877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.366117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxr7b\" (UniqueName: \"kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.483811 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.851354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" event={"ID":"6a07434f-3d3f-4d03-b2e8-007a9df2f23c","Type":"ContainerStarted","Data":"740fd31422fdefe167ff9449396e8b7542957aeb0fadac8ada69ea60e5fabaf3"} Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.851621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" event={"ID":"6a07434f-3d3f-4d03-b2e8-007a9df2f23c","Type":"ContainerStarted","Data":"2afcdeda4ec6392500e172a85f0874653e75d770ea9801b8d4a0b73ca4a4256e"} Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.865941 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" podStartSLOduration=0.865924309 podStartE2EDuration="865.924309ms" podCreationTimestamp="2026-03-20 18:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:28:37.863191754 +0000 UTC m=+4261.321223285" watchObservedRunningTime="2026-03-20 18:28:37.865924309 +0000 UTC 
m=+4261.323955850" Mar 20 18:28:38 crc kubenswrapper[4795]: I0320 18:28:38.859401 4795 generic.go:334] "Generic (PLEG): container finished" podID="6a07434f-3d3f-4d03-b2e8-007a9df2f23c" containerID="740fd31422fdefe167ff9449396e8b7542957aeb0fadac8ada69ea60e5fabaf3" exitCode=0 Mar 20 18:28:38 crc kubenswrapper[4795]: I0320 18:28:38.859445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" event={"ID":"6a07434f-3d3f-4d03-b2e8-007a9df2f23c","Type":"ContainerDied","Data":"740fd31422fdefe167ff9449396e8b7542957aeb0fadac8ada69ea60e5fabaf3"} Mar 20 18:28:39 crc kubenswrapper[4795]: I0320 18:28:39.990451 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.087172 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-zjcbs"] Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.098245 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-zjcbs"] Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.152335 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host\") pod \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.152449 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host" (OuterVolumeSpecName: "host") pod "6a07434f-3d3f-4d03-b2e8-007a9df2f23c" (UID: "6a07434f-3d3f-4d03-b2e8-007a9df2f23c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.152570 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxr7b\" (UniqueName: \"kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b\") pod \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.153173 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.163942 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b" (OuterVolumeSpecName: "kube-api-access-jxr7b") pod "6a07434f-3d3f-4d03-b2e8-007a9df2f23c" (UID: "6a07434f-3d3f-4d03-b2e8-007a9df2f23c"). InnerVolumeSpecName "kube-api-access-jxr7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.254459 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxr7b\" (UniqueName: \"kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.891344 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2afcdeda4ec6392500e172a85f0874653e75d770ea9801b8d4a0b73ca4a4256e" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.891604 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.263727 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a07434f-3d3f-4d03-b2e8-007a9df2f23c" path="/var/lib/kubelet/pods/6a07434f-3d3f-4d03-b2e8-007a9df2f23c/volumes" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.299897 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.299981 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.300049 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.301190 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.301294 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" 
containerName="machine-config-daemon" containerID="cri-o://6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc" gracePeriod=600 Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.318951 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-jrdxp"] Mar 20 18:28:41 crc kubenswrapper[4795]: E0320 18:28:41.319459 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a07434f-3d3f-4d03-b2e8-007a9df2f23c" containerName="container-00" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.319544 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a07434f-3d3f-4d03-b2e8-007a9df2f23c" containerName="container-00" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.319831 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a07434f-3d3f-4d03-b2e8-007a9df2f23c" containerName="container-00" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.320766 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.509775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg48g\" (UniqueName: \"kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.509855 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.611669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg48g\" (UniqueName: \"kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.612032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.612172 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc 
kubenswrapper[4795]: I0320 18:28:41.652433 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg48g\" (UniqueName: \"kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.902598 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc" exitCode=0 Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.902653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc"} Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.902701 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"} Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.902743 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.947352 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: W0320 18:28:41.984398 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44ed4b12_46ff_46fd_b451_308fec6fda3d.slice/crio-f1bdff2f5ff3211261939aeba7b33aa9b346dad647a5c207f455199bb02aabe4 WatchSource:0}: Error finding container f1bdff2f5ff3211261939aeba7b33aa9b346dad647a5c207f455199bb02aabe4: Status 404 returned error can't find the container with id f1bdff2f5ff3211261939aeba7b33aa9b346dad647a5c207f455199bb02aabe4 Mar 20 18:28:42 crc kubenswrapper[4795]: I0320 18:28:42.911086 4795 generic.go:334] "Generic (PLEG): container finished" podID="44ed4b12-46ff-46fd-b451-308fec6fda3d" containerID="cd9a82179132e7ce391f9016c2a0e0e1a65591e62d7440ada8662bd6a235be65" exitCode=0 Mar 20 18:28:42 crc kubenswrapper[4795]: I0320 18:28:42.911156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" event={"ID":"44ed4b12-46ff-46fd-b451-308fec6fda3d","Type":"ContainerDied","Data":"cd9a82179132e7ce391f9016c2a0e0e1a65591e62d7440ada8662bd6a235be65"} Mar 20 18:28:42 crc kubenswrapper[4795]: I0320 18:28:42.911591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" event={"ID":"44ed4b12-46ff-46fd-b451-308fec6fda3d","Type":"ContainerStarted","Data":"f1bdff2f5ff3211261939aeba7b33aa9b346dad647a5c207f455199bb02aabe4"} Mar 20 18:28:42 crc kubenswrapper[4795]: I0320 18:28:42.974114 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-jrdxp"] Mar 20 18:28:42 crc kubenswrapper[4795]: I0320 18:28:42.985506 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-jrdxp"] Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.038997 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.155814 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host\") pod \"44ed4b12-46ff-46fd-b451-308fec6fda3d\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.155988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg48g\" (UniqueName: \"kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g\") pod \"44ed4b12-46ff-46fd-b451-308fec6fda3d\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.156154 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host" (OuterVolumeSpecName: "host") pod "44ed4b12-46ff-46fd-b451-308fec6fda3d" (UID: "44ed4b12-46ff-46fd-b451-308fec6fda3d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.156563 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.165376 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g" (OuterVolumeSpecName: "kube-api-access-sg48g") pod "44ed4b12-46ff-46fd-b451-308fec6fda3d" (UID: "44ed4b12-46ff-46fd-b451-308fec6fda3d"). InnerVolumeSpecName "kube-api-access-sg48g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.259420 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg48g\" (UniqueName: \"kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.956335 4795 scope.go:117] "RemoveContainer" containerID="cd9a82179132e7ce391f9016c2a0e0e1a65591e62d7440ada8662bd6a235be65" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.956532 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:45 crc kubenswrapper[4795]: I0320 18:28:45.267241 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ed4b12-46ff-46fd-b451-308fec6fda3d" path="/var/lib/kubelet/pods/44ed4b12-46ff-46fd-b451-308fec6fda3d/volumes" Mar 20 18:29:12 crc kubenswrapper[4795]: I0320 18:29:12.634760 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84776bb8f8-wkk7m_6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97/barbican-api/0.log" Mar 20 18:29:12 crc kubenswrapper[4795]: I0320 18:29:12.722794 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84776bb8f8-wkk7m_6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97/barbican-api-log/0.log" Mar 20 18:29:12 crc kubenswrapper[4795]: I0320 18:29:12.883255 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76977cb5bb-84w8l_faa8c15c-b759-4db8-ac4d-28648a8cfde2/barbican-keystone-listener/0.log" Mar 20 18:29:12 crc kubenswrapper[4795]: I0320 18:29:12.900563 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76977cb5bb-84w8l_faa8c15c-b759-4db8-ac4d-28648a8cfde2/barbican-keystone-listener-log/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.006197 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-558cc4f6c9-d6wp7_aa9d179b-7e78-4a37-80aa-3f3f6e7cabea/barbican-worker/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.074540 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-558cc4f6c9-d6wp7_aa9d179b-7e78-4a37-80aa-3f3f6e7cabea/barbican-worker-log/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.348299 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/ceilometer-central-agent/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.446673 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/ceilometer-notification-agent/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.471445 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/proxy-httpd/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.524883 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-65dps_0708214e-e711-465a-a54e-97a462b2777e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.539113 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/sg-core/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.711650 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0b19426b-81a4-4498-9754-948e8b7154d9/cinder-api/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.737718 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0b19426b-81a4-4498-9754-948e8b7154d9/cinder-api-log/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 
18:29:14.033669 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8eeb13a5-ef36-44eb-9dfd-7798e9ad1620/cinder-scheduler/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.076401 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8eeb13a5-ef36-44eb-9dfd-7798e9ad1620/probe/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.391119 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hprm9_2bad20c9-d77a-4c30-8fa2-979c05697cf4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.581511 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/init/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.713777 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm_3d666090-1065-4b2d-9ac6-b84776b53d0a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.790759 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/init/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.970874 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/dnsmasq-dns/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.100323 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k_b0af5324-4ba3-4a12-9fdb-b467918ba19d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.110799 4795 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_glance-default-external-api-0_264c2db4-1919-41ce-aea3-bd777167a9ca/glance-httpd/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.197531 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_264c2db4-1919-41ce-aea3-bd777167a9ca/glance-log/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.351438 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_81d40eb0-c26d-46e7-b8be-631de2f502b9/glance-httpd/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.412607 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_81d40eb0-c26d-46e7-b8be-631de2f502b9/glance-log/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.659879 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7fb74ddb8-dbrvh_f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109/horizon/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.814290 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5_0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:16 crc kubenswrapper[4795]: I0320 18:29:16.023004 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7fb74ddb8-dbrvh_f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109/horizon-log/0.log" Mar 20 18:29:16 crc kubenswrapper[4795]: I0320 18:29:16.378564 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567161-t26vc_cdfe5ffc-ab15-4277-966f-f506e725e8b1/keystone-cron/0.log" Mar 20 18:29:16 crc kubenswrapper[4795]: I0320 18:29:16.659727 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_72605c7d-99df-450f-900b-3022b0520149/kube-state-metrics/0.log" Mar 20 18:29:16 crc kubenswrapper[4795]: I0320 
18:29:16.881706 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5rj55_20b330a0-830c-419e-81fe-a36dd1a32cc2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:16 crc kubenswrapper[4795]: I0320 18:29:16.983854 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85b996ff68-fdzxg_7b20a034-11f6-40ad-9447-32c49f705c07/keystone-api/0.log" Mar 20 18:29:17 crc kubenswrapper[4795]: I0320 18:29:17.713177 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-649db44647-mrjns_5a472785-4467-4c97-93b9-e6f6eff19126/neutron-httpd/0.log" Mar 20 18:29:17 crc kubenswrapper[4795]: I0320 18:29:17.837281 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-649db44647-mrjns_5a472785-4467-4c97-93b9-e6f6eff19126/neutron-api/0.log" Mar 20 18:29:17 crc kubenswrapper[4795]: I0320 18:29:17.937068 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7_e29f4857-ff0d-4806-ba09-74448200e8e2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.151385 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rp77q_b6da9d2a-e18f-4994-b8f3-6b1eb969564b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.235449 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_480a6609-0395-4bda-9ec8-a3ebf30931a7/nova-api-log/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.670423 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5916e4d2-2863-4088-be97-cf368906820b/nova-cell0-conductor-conductor/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.715615 4795 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_nova-api-0_480a6609-0395-4bda-9ec8-a3ebf30931a7/nova-api-api/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.727084 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_19c15c93-572c-4d53-b924-172f3ad29c8a/nova-cell1-conductor-conductor/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.795451 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:19 crc kubenswrapper[4795]: E0320 18:29:19.795954 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ed4b12-46ff-46fd-b451-308fec6fda3d" containerName="container-00" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.795975 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ed4b12-46ff-46fd-b451-308fec6fda3d" containerName="container-00" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.796513 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ed4b12-46ff-46fd-b451-308fec6fda3d" containerName="container-00" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.797946 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.849944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.850054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.850113 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzv7\" (UniqueName: \"kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.869722 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.951262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.951328 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.951372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzv7\" (UniqueName: \"kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.952041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.952057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:20 crc kubenswrapper[4795]: I0320 18:29:20.014501 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d2a5e398-6d25-43b1-8c29-407af2d9348b/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 18:29:20 crc kubenswrapper[4795]: I0320 18:29:20.507093 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzv7\" (UniqueName: \"kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7\") pod \"redhat-marketplace-27vc2\" (UID: 
\"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:20 crc kubenswrapper[4795]: I0320 18:29:20.765406 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:20 crc kubenswrapper[4795]: I0320 18:29:20.911483 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ff4af01b-01b5-4154-8591-7ec99e3d6ef0/nova-metadata-log/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.225800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.317240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerStarted","Data":"f48d14fe770b6f14c5b8978e7e896e405febd9356d726a808ae98af698bf5a4d"} Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.444931 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/mysql-bootstrap/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.527893 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ff4af01b-01b5-4154-8591-7ec99e3d6ef0/nova-metadata-metadata/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.691650 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/mysql-bootstrap/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.698141 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c23f56ff-eceb-4891-87e5-57ebeb7eba8d/nova-scheduler-scheduler/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.735829 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/galera/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.924947 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/mysql-bootstrap/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.999170 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-kc4wx_709f5080-c511-4d3b-bc9c-baeec85fa245/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.080234 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/mysql-bootstrap/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.163369 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/galera/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.213922 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_cf3f8aea-393e-418a-ad14-2848c8df93e9/openstackclient/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.333276 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerID="5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1" exitCode=0 Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.333318 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerDied","Data":"5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1"} Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.337346 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:29:22 crc 
kubenswrapper[4795]: I0320 18:29:22.411613 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dnp2g_28df10bb-d6a9-47a9-9b79-0bb9665529ef/ovn-controller/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.420781 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-n4gzx_85004117-20bc-474e-88f5-ce49032749ff/openstack-network-exporter/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.660177 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server-init/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.804315 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server-init/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.852023 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovs-vswitchd/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.853566 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.056892 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6cfc9397-7268-4bd1-8bbf-d107e94ab35a/openstack-network-exporter/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.164923 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6cfc9397-7268-4bd1-8bbf-d107e94ab35a/ovn-northd/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.320220 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b9d4ac2-2b66-441a-a6d4-0d467d857f99/openstack-network-exporter/0.log" Mar 20 18:29:23 crc 
kubenswrapper[4795]: I0320 18:29:23.366467 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9jw45_6c737290-0616-475b-a839-cca387d8d90d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.375530 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b9d4ac2-2b66-441a-a6d4-0d467d857f99/ovsdbserver-nb/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.557409 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c07f346e-3e6c-41a5-bdda-67a4a5f04ba7/openstack-network-exporter/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.560966 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c07f346e-3e6c-41a5-bdda-67a4a5f04ba7/ovsdbserver-sb/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.903126 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/setup-container/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.053602 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-fc784f9bb-wjct6_48841a5b-142c-49d0-8e87-8562f8d1f824/placement-api/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.086103 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-fc784f9bb-wjct6_48841a5b-142c-49d0-8e87-8562f8d1f824/placement-log/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.089613 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/setup-container/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.210028 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/rabbitmq/0.log" Mar 
20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.326892 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/setup-container/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.353586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerStarted","Data":"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20"} Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.483966 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/setup-container/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.537744 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/rabbitmq/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.583293 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88_1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.804598 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-tx6d9_d7dc5d37-6d24-48ea-acc1-2b4ed3de6936/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.845832 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk_e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.087231 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j6rls_80cf5a83-936d-4789-a7bc-b91cdb0e564d/ssh-known-hosts-edpm-deployment/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.112726 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-prfq6_9cdb4943-60a1-41cc-aead-1702a4c1f68a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.140166 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ca95ec62-fce9-4c91-bb59-fa80f512edba/memcached/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.302540 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6697f55ff5-fj55x_e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6/proxy-server/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.360164 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6697f55ff5-fj55x_e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6/proxy-httpd/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.364125 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerID="8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20" exitCode=0 Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.364174 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerDied","Data":"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20"} Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.373438 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-m8zw5_2c422574-0103-4c97-9e23-5a78c5b44e69/swift-ring-rebalance/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.529733 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-auditor/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.571803 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-reaper/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.609394 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-server/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.639335 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-replicator/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.698898 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-auditor/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.749168 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-replicator/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.783059 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-server/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.797243 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-updater/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.856829 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-auditor/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.941483 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-expirer/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.960810 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-replicator/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.975374 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-server/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.986617 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-updater/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.074138 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/rsync/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.115304 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/swift-recon-cron/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.299613 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_caaf60a5-8c45-4831-8d26-8cf808f1da7a/tempest-tests-tempest-tests-runner/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.372940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerStarted","Data":"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000"} Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.397711 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-27vc2" podStartSLOduration=3.742151994 podStartE2EDuration="7.397677217s" 
podCreationTimestamp="2026-03-20 18:29:19 +0000 UTC" firstStartedPulling="2026-03-20 18:29:22.337121263 +0000 UTC m=+4305.795152804" lastFinishedPulling="2026-03-20 18:29:25.992646486 +0000 UTC m=+4309.450678027" observedRunningTime="2026-03-20 18:29:26.389836754 +0000 UTC m=+4309.847868295" watchObservedRunningTime="2026-03-20 18:29:26.397677217 +0000 UTC m=+4309.855708758" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.462136 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0/test-operator-logs-container/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.617375 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5_35b4aa82-d668-474b-b54d-b540190f5a6c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.864168 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh_d519d04c-89f1-46b7-8136-1a9596af73ac/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:30 crc kubenswrapper[4795]: I0320 18:29:30.766127 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:30 crc kubenswrapper[4795]: I0320 18:29:30.766642 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:30 crc kubenswrapper[4795]: I0320 18:29:30.814997 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:31 crc kubenswrapper[4795]: I0320 18:29:31.515325 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:31 crc 
kubenswrapper[4795]: I0320 18:29:31.570368 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:33 crc kubenswrapper[4795]: I0320 18:29:33.448460 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-27vc2" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="registry-server" containerID="cri-o://57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000" gracePeriod=2 Mar 20 18:29:33 crc kubenswrapper[4795]: I0320 18:29:33.943453 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.035357 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content\") pod \"5d33b616-99f6-473a-8114-0203d0f7e9fb\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.035413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzzv7\" (UniqueName: \"kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7\") pod \"5d33b616-99f6-473a-8114-0203d0f7e9fb\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.035436 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities\") pod \"5d33b616-99f6-473a-8114-0203d0f7e9fb\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.036603 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities" (OuterVolumeSpecName: "utilities") pod "5d33b616-99f6-473a-8114-0203d0f7e9fb" (UID: "5d33b616-99f6-473a-8114-0203d0f7e9fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.041108 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7" (OuterVolumeSpecName: "kube-api-access-gzzv7") pod "5d33b616-99f6-473a-8114-0203d0f7e9fb" (UID: "5d33b616-99f6-473a-8114-0203d0f7e9fb"). InnerVolumeSpecName "kube-api-access-gzzv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.063548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d33b616-99f6-473a-8114-0203d0f7e9fb" (UID: "5d33b616-99f6-473a-8114-0203d0f7e9fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.137673 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.137731 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzzv7\" (UniqueName: \"kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7\") on node \"crc\" DevicePath \"\"" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.137744 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.459783 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerID="57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000" exitCode=0 Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.459834 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.459853 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerDied","Data":"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000"} Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.461071 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerDied","Data":"f48d14fe770b6f14c5b8978e7e896e405febd9356d726a808ae98af698bf5a4d"} Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.461101 4795 scope.go:117] "RemoveContainer" containerID="57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.481489 4795 scope.go:117] "RemoveContainer" containerID="8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.500815 4795 scope.go:117] "RemoveContainer" containerID="5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.505011 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.513532 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.551415 4795 scope.go:117] "RemoveContainer" containerID="57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000" Mar 20 18:29:34 crc kubenswrapper[4795]: E0320 18:29:34.551945 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000\": container with ID starting with 57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000 not found: ID does not exist" containerID="57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.551989 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000"} err="failed to get container status \"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000\": rpc error: code = NotFound desc = could not find container \"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000\": container with ID starting with 57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000 not found: ID does not exist" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.552017 4795 scope.go:117] "RemoveContainer" containerID="8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20" Mar 20 18:29:34 crc kubenswrapper[4795]: E0320 18:29:34.552459 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20\": container with ID starting with 8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20 not found: ID does not exist" containerID="8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.552488 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20"} err="failed to get container status \"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20\": rpc error: code = NotFound desc = could not find container \"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20\": container with ID 
starting with 8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20 not found: ID does not exist" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.552508 4795 scope.go:117] "RemoveContainer" containerID="5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1" Mar 20 18:29:34 crc kubenswrapper[4795]: E0320 18:29:34.552754 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1\": container with ID starting with 5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1 not found: ID does not exist" containerID="5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.552793 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1"} err="failed to get container status \"5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1\": rpc error: code = NotFound desc = could not find container \"5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1\": container with ID starting with 5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1 not found: ID does not exist" Mar 20 18:29:35 crc kubenswrapper[4795]: I0320 18:29:35.261265 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" path="/var/lib/kubelet/pods/5d33b616-99f6-473a-8114-0203d0f7e9fb/volumes" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.351945 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.534145 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.545764 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.591137 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.764663 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.767177 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/extract/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.773317 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:29:53 crc kubenswrapper[4795]: I0320 18:29:53.064141 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-5hzvs_afefdb79-bad6-4deb-904b-515174cca414/manager/0.log" Mar 20 18:29:53 crc kubenswrapper[4795]: I0320 18:29:53.190544 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-jgs27_43804d6b-2358-46fd-bf04-26b2308f8ab0/manager/0.log" Mar 20 18:29:53 crc 
kubenswrapper[4795]: I0320 18:29:53.366886 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-dwx6n_a957ef3d-357c-4aa4-865c-533f889257d7/manager/0.log" Mar 20 18:29:53 crc kubenswrapper[4795]: I0320 18:29:53.527287 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-rmcrf_4cdd16c5-b7d3-4c52-a286-f3555daf43d9/manager/0.log" Mar 20 18:29:53 crc kubenswrapper[4795]: I0320 18:29:53.678987 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-f74p9_ded84ba8-d70a-4379-bc80-d142e5306cc7/manager/0.log" Mar 20 18:29:53 crc kubenswrapper[4795]: I0320 18:29:53.993744 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-55vp5_9cba9cd3-4144-4262-82a2-f2330793aae6/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.259659 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-6hsxn_84901a7b-ddbf-47d9-954f-c167cd9cd46c/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.403733 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6f8b7f6fdf-lrjfh_fc0f2e63-50dd-424e-af01-3d09c9edd5b3/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.482086 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-trjt4_7a887d91-fa86-45d2-a6be-aa7326f7d544/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.662486 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-jfdzb_071f0af8-4164-4f95-b0ee-720e3b3097f3/manager/0.log" Mar 20 
18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.802213 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-bqzcz_0ffe016b-8919-4b8f-839c-669637b7accc/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.821545 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-h9f9t_21481bba-04ec-47ce-95d0-fe27787a3d62/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.958958 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-5v5sg_0da03e08-561c-4b5f-89c7-af80c8f39f54/manager/0.log" Mar 20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.068913 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-n7cl7_d4ff6977-1303-4267-983e-3e99935f2aae/manager/0.log" Mar 20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.158433 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f557zsq_a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2/manager/0.log" Mar 20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.348401 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65b67cc5c9-vm29j_084071f5-e58b-451b-9cf5-67203ae1ba02/operator/0.log" Mar 20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.583128 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-b6ckg_3aeffd27-d2c7-4744-8e01-07a4db74597e/registry-server/0.log" Mar 20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.779277 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-dtfmz_84a19583-b173-4fb9-8b83-d9c41a5faf79/manager/0.log" Mar 
20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.914780 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-6cw7v_b47e6216-2e29-4d58-8b0c-5970aee6307b/manager/0.log" Mar 20 18:29:56 crc kubenswrapper[4795]: I0320 18:29:56.140011 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-828jr_750d9405-0514-4876-821e-9ab1f6871e87/manager/0.log" Mar 20 18:29:56 crc kubenswrapper[4795]: I0320 18:29:56.258031 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-jbwss_46248665-6f9f-46e0-8db7-6be8c47cf521/manager/0.log" Mar 20 18:29:56 crc kubenswrapper[4795]: I0320 18:29:56.396448 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-rv5df_e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38/manager/0.log" Mar 20 18:29:56 crc kubenswrapper[4795]: I0320 18:29:56.533470 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-6z7j5_933bcfd5-f2d1-404f-876d-1d3da597f415/manager/0.log" Mar 20 18:29:56 crc kubenswrapper[4795]: I0320 18:29:56.630768 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-56f44579c8-px2ft_0d8b26db-957e-4c0e-bb22-42f12d5beb0b/manager/0.log" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.147229 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567190-fmpq8"] Mar 20 18:30:00 crc kubenswrapper[4795]: E0320 18:30:00.148211 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="extract-utilities" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.148230 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="extract-utilities" Mar 20 18:30:00 crc kubenswrapper[4795]: E0320 18:30:00.148264 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="registry-server" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.148272 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="registry-server" Mar 20 18:30:00 crc kubenswrapper[4795]: E0320 18:30:00.148292 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="extract-content" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.148299 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="extract-content" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.148517 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="registry-server" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.149318 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.151236 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.151406 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.151461 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.154894 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g"] Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.155969 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.158362 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.158564 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.171755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-fmpq8"] Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.205682 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g"] Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.253398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.254214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.254252 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs6fw\" (UniqueName: \"kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw\") pod \"auto-csr-approver-29567190-fmpq8\" (UID: \"a7e72f01-1ab6-47a2-99d2-ff2778039c34\") " pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.254296 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gwtm\" (UniqueName: \"kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.356366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 
18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.356412 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs6fw\" (UniqueName: \"kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw\") pod \"auto-csr-approver-29567190-fmpq8\" (UID: \"a7e72f01-1ab6-47a2-99d2-ff2778039c34\") " pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.356457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gwtm\" (UniqueName: \"kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.356609 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.357374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.366471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume\") pod \"collect-profiles-29567190-k669g\" (UID: 
\"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.967276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs6fw\" (UniqueName: \"kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw\") pod \"auto-csr-approver-29567190-fmpq8\" (UID: \"a7e72f01-1ab6-47a2-99d2-ff2778039c34\") " pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.967587 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gwtm\" (UniqueName: \"kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.073315 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.107805 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.629194 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-fmpq8"] Mar 20 18:30:01 crc kubenswrapper[4795]: W0320 18:30:01.635299 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4a6542a_c4e9_4747_89bd_b15f37e98854.slice/crio-e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64 WatchSource:0}: Error finding container e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64: Status 404 returned error can't find the container with id e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64 Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.639338 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g"] Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.722091 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" event={"ID":"f4a6542a-c4e9-4747-89bd-b15f37e98854","Type":"ContainerStarted","Data":"e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64"} Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.722918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" event={"ID":"a7e72f01-1ab6-47a2-99d2-ff2778039c34","Type":"ContainerStarted","Data":"da4b300832cbaf79cd4467333ae6ea3634f2faeac48948a563d0a49de3b8854a"} Mar 20 18:30:02 crc kubenswrapper[4795]: I0320 18:30:02.731352 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4a6542a-c4e9-4747-89bd-b15f37e98854" containerID="c6eef357aa6aa565680007e12db67df115ab8783e62776b7bb3c464aafd1537a" exitCode=0 Mar 20 18:30:02 crc kubenswrapper[4795]: I0320 18:30:02.731404 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" event={"ID":"f4a6542a-c4e9-4747-89bd-b15f37e98854","Type":"ContainerDied","Data":"c6eef357aa6aa565680007e12db67df115ab8783e62776b7bb3c464aafd1537a"} Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.076537 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.134947 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume\") pod \"f4a6542a-c4e9-4747-89bd-b15f37e98854\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.135501 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume\") pod \"f4a6542a-c4e9-4747-89bd-b15f37e98854\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.135581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gwtm\" (UniqueName: \"kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm\") pod \"f4a6542a-c4e9-4747-89bd-b15f37e98854\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.136223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume" (OuterVolumeSpecName: "config-volume") pod "f4a6542a-c4e9-4747-89bd-b15f37e98854" (UID: "f4a6542a-c4e9-4747-89bd-b15f37e98854"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.143313 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f4a6542a-c4e9-4747-89bd-b15f37e98854" (UID: "f4a6542a-c4e9-4747-89bd-b15f37e98854"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.143785 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm" (OuterVolumeSpecName: "kube-api-access-4gwtm") pod "f4a6542a-c4e9-4747-89bd-b15f37e98854" (UID: "f4a6542a-c4e9-4747-89bd-b15f37e98854"). InnerVolumeSpecName "kube-api-access-4gwtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.237362 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.237392 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gwtm\" (UniqueName: \"kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.237401 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.755784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" 
event={"ID":"f4a6542a-c4e9-4747-89bd-b15f37e98854","Type":"ContainerDied","Data":"e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64"} Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.755813 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.755825 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64" Mar 20 18:30:05 crc kubenswrapper[4795]: I0320 18:30:05.152415 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"] Mar 20 18:30:05 crc kubenswrapper[4795]: I0320 18:30:05.167928 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"] Mar 20 18:30:05 crc kubenswrapper[4795]: I0320 18:30:05.262114 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd60241d-b207-4a9a-86b6-3be32ab282d3" path="/var/lib/kubelet/pods/cd60241d-b207-4a9a-86b6-3be32ab282d3/volumes" Mar 20 18:30:05 crc kubenswrapper[4795]: I0320 18:30:05.773943 4795 generic.go:334] "Generic (PLEG): container finished" podID="a7e72f01-1ab6-47a2-99d2-ff2778039c34" containerID="e6a437e3ef5671482fc87ddf7b0443a4a6151e38d08d0d94800ebfc859f95be2" exitCode=0 Mar 20 18:30:05 crc kubenswrapper[4795]: I0320 18:30:05.774002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" event={"ID":"a7e72f01-1ab6-47a2-99d2-ff2778039c34","Type":"ContainerDied","Data":"e6a437e3ef5671482fc87ddf7b0443a4a6151e38d08d0d94800ebfc859f95be2"} Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.128501 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.288803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs6fw\" (UniqueName: \"kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw\") pod \"a7e72f01-1ab6-47a2-99d2-ff2778039c34\" (UID: \"a7e72f01-1ab6-47a2-99d2-ff2778039c34\") " Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.300458 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw" (OuterVolumeSpecName: "kube-api-access-hs6fw") pod "a7e72f01-1ab6-47a2-99d2-ff2778039c34" (UID: "a7e72f01-1ab6-47a2-99d2-ff2778039c34"). InnerVolumeSpecName "kube-api-access-hs6fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.391432 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs6fw\" (UniqueName: \"kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.794195 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" event={"ID":"a7e72f01-1ab6-47a2-99d2-ff2778039c34","Type":"ContainerDied","Data":"da4b300832cbaf79cd4467333ae6ea3634f2faeac48948a563d0a49de3b8854a"} Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.794238 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da4b300832cbaf79cd4467333ae6ea3634f2faeac48948a563d0a49de3b8854a" Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.794251 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:08 crc kubenswrapper[4795]: I0320 18:30:08.190502 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-gzns5"] Mar 20 18:30:08 crc kubenswrapper[4795]: I0320 18:30:08.201951 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-gzns5"] Mar 20 18:30:09 crc kubenswrapper[4795]: I0320 18:30:09.266553 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="133e7bff-461c-4450-bf3b-8d43791045a4" path="/var/lib/kubelet/pods/133e7bff-461c-4450-bf3b-8d43791045a4/volumes" Mar 20 18:30:12 crc kubenswrapper[4795]: I0320 18:30:12.039312 4795 scope.go:117] "RemoveContainer" containerID="ccece14cbf4b4c8c9889d9ebad0a41bd1c87c88349be10c251ccd8a08eb4cac4" Mar 20 18:30:12 crc kubenswrapper[4795]: I0320 18:30:12.069215 4795 scope.go:117] "RemoveContainer" containerID="7e78ac608afa56e8111695b336413ee802aca06929422f0042e8a413df5d1f4a" Mar 20 18:30:17 crc kubenswrapper[4795]: I0320 18:30:17.147080 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-knn77_cd9b8a97-1b9d-4365-a985-a02d4078e3c2/control-plane-machine-set-operator/0.log" Mar 20 18:30:17 crc kubenswrapper[4795]: I0320 18:30:17.330086 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-p5hmr_9f31b9ac-9447-4b20-ac60-7532edfa4600/kube-rbac-proxy/0.log" Mar 20 18:30:17 crc kubenswrapper[4795]: I0320 18:30:17.382288 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-p5hmr_9f31b9ac-9447-4b20-ac60-7532edfa4600/machine-api-operator/0.log" Mar 20 18:30:32 crc kubenswrapper[4795]: I0320 18:30:32.115182 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-lqmsr_5231a25a-8bda-4f72-8a81-e5a49cdc31eb/cert-manager-controller/0.log" Mar 20 18:30:32 crc kubenswrapper[4795]: I0320 18:30:32.218926 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-smr2n_7df834a3-0298-4cc9-8b4e-49ce3f51183e/cert-manager-cainjector/0.log" Mar 20 18:30:32 crc kubenswrapper[4795]: I0320 18:30:32.323304 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cff8c_88832f68-9f72-4321-8d3f-bb3e23465fdb/cert-manager-webhook/0.log" Mar 20 18:30:41 crc kubenswrapper[4795]: I0320 18:30:41.299949 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:30:41 crc kubenswrapper[4795]: I0320 18:30:41.300604 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:30:46 crc kubenswrapper[4795]: I0320 18:30:46.906286 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-5jfjl_d34761db-41bf-4e5f-bdca-8c25e281c924/nmstate-console-plugin/0.log" Mar 20 18:30:47 crc kubenswrapper[4795]: I0320 18:30:47.141062 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bsp49_e070281f-65f5-4c6d-b012-06c027393646/nmstate-handler/0.log" Mar 20 18:30:47 crc kubenswrapper[4795]: I0320 18:30:47.163825 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-xjj2s_65c42497-77ba-49bc-a292-5003a353fde6/kube-rbac-proxy/0.log" Mar 20 18:30:47 crc kubenswrapper[4795]: I0320 18:30:47.193606 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-xjj2s_65c42497-77ba-49bc-a292-5003a353fde6/nmstate-metrics/0.log" Mar 20 18:30:47 crc kubenswrapper[4795]: I0320 18:30:47.392720 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-mjhsq_f50011ef-d180-4d84-ba10-a2da522a579d/nmstate-webhook/0.log" Mar 20 18:30:47 crc kubenswrapper[4795]: I0320 18:30:47.452588 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-dlcps_efca4120-31ef-4c52-a6da-59b33144a979/nmstate-operator/0.log" Mar 20 18:31:11 crc kubenswrapper[4795]: I0320 18:31:11.299839 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:31:11 crc kubenswrapper[4795]: I0320 18:31:11.300657 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.461336 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kvtc5_2ce06e1f-5454-4b85-888b-3230c0086c2e/kube-rbac-proxy/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.502625 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kvtc5_2ce06e1f-5454-4b85-888b-3230c0086c2e/controller/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.545174 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.672386 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.699370 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.699445 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.748236 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.980738 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.010173 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.010535 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.033985 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.762060 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.774328 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/controller/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.779469 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.787832 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.957150 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/kube-rbac-proxy-frr/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.980317 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/frr-metrics/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.015845 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/kube-rbac-proxy/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.179836 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/reloader/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.240299 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jw5dk_377dbbb7-0571-40cd-9fe3-3c86fbf4f092/frr-k8s-webhook-server/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.503176 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7547f4d8c8-499mj_0e8dba8d-8387-4ced-ac54-b8d5e1cf3650/manager/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.755388 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5ffc48dc7-t9vwn_2d29ac93-da31-4834-a858-d5bd9adb28d1/webhook-server/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.761461 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bl9qp_8834c8fc-36f7-41da-867f-ec5a32e25b36/kube-rbac-proxy/0.log" Mar 20 18:31:22 crc kubenswrapper[4795]: I0320 18:31:22.415537 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bl9qp_8834c8fc-36f7-41da-867f-ec5a32e25b36/speaker/0.log" Mar 20 18:31:22 crc kubenswrapper[4795]: I0320 18:31:22.523395 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/frr/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.426768 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.661478 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.704790 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.718543 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.900336 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.914863 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/extract/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.943950 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.078584 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.260287 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.262586 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 
18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.267292 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.444413 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.449528 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.468276 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/extract/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.592424 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.803044 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.811495 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.837853 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 
18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.980099 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.999570 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.214907 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.411898 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/registry-server/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.428805 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.437274 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.455945 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.619867 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.620049 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.928867 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8x76m_a2de2777-57e1-4310-a878-1cfc1fc77e44/marketplace-operator/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.950496 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:31:39 crc kubenswrapper[4795]: I0320 18:31:39.135323 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/registry-server/0.log" Mar 20 18:31:39 crc kubenswrapper[4795]: I0320 18:31:39.186733 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:31:39 crc kubenswrapper[4795]: I0320 18:31:39.861417 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:31:39 crc kubenswrapper[4795]: I0320 18:31:39.891378 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.052788 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.059331 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.230849 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/registry-server/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.288550 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.425485 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.457856 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.458972 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.618416 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.645981 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.125308 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/registry-server/0.log" Mar 20 
18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.299959 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.300018 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.300069 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.301043 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.301120 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" gracePeriod=600 Mar 20 18:31:41 crc kubenswrapper[4795]: E0320 18:31:41.531379 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.688010 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" exitCode=0 Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.688070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"} Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.688117 4795 scope.go:117] "RemoveContainer" containerID="6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.688867 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:31:41 crc kubenswrapper[4795]: E0320 18:31:41.689163 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:31:53 crc kubenswrapper[4795]: I0320 18:31:53.254968 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:31:53 crc kubenswrapper[4795]: E0320 18:31:53.256123 4795 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.142512 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567192-4p2jk"] Mar 20 18:32:00 crc kubenswrapper[4795]: E0320 18:32:00.143343 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e72f01-1ab6-47a2-99d2-ff2778039c34" containerName="oc" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.143356 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e72f01-1ab6-47a2-99d2-ff2778039c34" containerName="oc" Mar 20 18:32:00 crc kubenswrapper[4795]: E0320 18:32:00.143393 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a6542a-c4e9-4747-89bd-b15f37e98854" containerName="collect-profiles" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.143400 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a6542a-c4e9-4747-89bd-b15f37e98854" containerName="collect-profiles" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.143572 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e72f01-1ab6-47a2-99d2-ff2778039c34" containerName="oc" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.143582 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a6542a-c4e9-4747-89bd-b15f37e98854" containerName="collect-profiles" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.144157 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.151753 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-4p2jk"] Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.152581 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.152646 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.152821 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.270118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h\") pod \"auto-csr-approver-29567192-4p2jk\" (UID: \"6cec1e8e-999e-44e2-a9b5-387a10c5de11\") " pod="openshift-infra/auto-csr-approver-29567192-4p2jk" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.371940 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h\") pod \"auto-csr-approver-29567192-4p2jk\" (UID: \"6cec1e8e-999e-44e2-a9b5-387a10c5de11\") " pod="openshift-infra/auto-csr-approver-29567192-4p2jk" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.411429 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h\") pod \"auto-csr-approver-29567192-4p2jk\" (UID: \"6cec1e8e-999e-44e2-a9b5-387a10c5de11\") " 
pod="openshift-infra/auto-csr-approver-29567192-4p2jk"
Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.466878 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-4p2jk"
Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.977249 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-4p2jk"]
Mar 20 18:32:01 crc kubenswrapper[4795]: I0320 18:32:01.877063 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" event={"ID":"6cec1e8e-999e-44e2-a9b5-387a10c5de11","Type":"ContainerStarted","Data":"d03146044402c32a68ed0339fdaeb2a74f4f65f66e1f308a5a0fa66c04fc2f6f"}
Mar 20 18:32:02 crc kubenswrapper[4795]: I0320 18:32:02.888497 4795 generic.go:334] "Generic (PLEG): container finished" podID="6cec1e8e-999e-44e2-a9b5-387a10c5de11" containerID="aef631dab3c0134ed5918f9313f1d6153c072f4d69296d8ee64df68955ab56a0" exitCode=0
Mar 20 18:32:02 crc kubenswrapper[4795]: I0320 18:32:02.888630 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" event={"ID":"6cec1e8e-999e-44e2-a9b5-387a10c5de11","Type":"ContainerDied","Data":"aef631dab3c0134ed5918f9313f1d6153c072f4d69296d8ee64df68955ab56a0"}
Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.276094 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-4p2jk"
Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.449774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h\") pod \"6cec1e8e-999e-44e2-a9b5-387a10c5de11\" (UID: \"6cec1e8e-999e-44e2-a9b5-387a10c5de11\") "
Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.457666 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h" (OuterVolumeSpecName: "kube-api-access-j2x4h") pod "6cec1e8e-999e-44e2-a9b5-387a10c5de11" (UID: "6cec1e8e-999e-44e2-a9b5-387a10c5de11"). InnerVolumeSpecName "kube-api-access-j2x4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.551628 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h\") on node \"crc\" DevicePath \"\""
Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.907913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" event={"ID":"6cec1e8e-999e-44e2-a9b5-387a10c5de11","Type":"ContainerDied","Data":"d03146044402c32a68ed0339fdaeb2a74f4f65f66e1f308a5a0fa66c04fc2f6f"}
Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.907977 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-4p2jk"
Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.908041 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d03146044402c32a68ed0339fdaeb2a74f4f65f66e1f308a5a0fa66c04fc2f6f"
Mar 20 18:32:05 crc kubenswrapper[4795]: I0320 18:32:05.341606 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-l88zk"]
Mar 20 18:32:05 crc kubenswrapper[4795]: I0320 18:32:05.348420 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-l88zk"]
Mar 20 18:32:07 crc kubenswrapper[4795]: I0320 18:32:07.259386 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:32:07 crc kubenswrapper[4795]: E0320 18:32:07.261174 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:32:07 crc kubenswrapper[4795]: I0320 18:32:07.262278 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="654cb8e4-7fd7-4e3e-955a-a71906ccfb79" path="/var/lib/kubelet/pods/654cb8e4-7fd7-4e3e-955a-a71906ccfb79/volumes"
Mar 20 18:32:12 crc kubenswrapper[4795]: I0320 18:32:12.183375 4795 scope.go:117] "RemoveContainer" containerID="43b386b54b2c5ae34c509074586a617552173f2543677683ef8d11caf140f2f9"
Mar 20 18:32:20 crc kubenswrapper[4795]: I0320 18:32:20.252538 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:32:20 crc kubenswrapper[4795]: E0320 18:32:20.253426 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:32:32 crc kubenswrapper[4795]: I0320 18:32:32.252470 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:32:32 crc kubenswrapper[4795]: E0320 18:32:32.253638 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:32:43 crc kubenswrapper[4795]: I0320 18:32:43.253587 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:32:43 crc kubenswrapper[4795]: E0320 18:32:43.254369 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:32:54 crc kubenswrapper[4795]: I0320 18:32:54.252721 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:32:54 crc kubenswrapper[4795]: E0320 18:32:54.253632 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:33:06 crc kubenswrapper[4795]: I0320 18:33:06.252728 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:33:06 crc kubenswrapper[4795]: E0320 18:33:06.253953 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:33:17 crc kubenswrapper[4795]: I0320 18:33:17.263739 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:33:17 crc kubenswrapper[4795]: E0320 18:33:17.264956 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:33:30 crc kubenswrapper[4795]: I0320 18:33:30.252157 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:33:30 crc kubenswrapper[4795]: E0320 18:33:30.253426 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:33:44 crc kubenswrapper[4795]: I0320 18:33:44.253128 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:33:44 crc kubenswrapper[4795]: E0320 18:33:44.254362 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:33:58 crc kubenswrapper[4795]: I0320 18:33:58.252662 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:33:58 crc kubenswrapper[4795]: E0320 18:33:58.254142 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:33:59 crc kubenswrapper[4795]: I0320 18:33:59.145793 4795 generic.go:334] "Generic (PLEG): container finished" podID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerID="fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553" exitCode=0
Mar 20 18:33:59 crc kubenswrapper[4795]: I0320 18:33:59.145837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" event={"ID":"a508da41-3cdb-4b99-b14e-a917c5153c72","Type":"ContainerDied","Data":"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"}
Mar 20 18:33:59 crc kubenswrapper[4795]: I0320 18:33:59.146422 4795 scope.go:117] "RemoveContainer" containerID="fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"
Mar 20 18:33:59 crc kubenswrapper[4795]: I0320 18:33:59.264781 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n2wfg_must-gather-gb8cc_a508da41-3cdb-4b99-b14e-a917c5153c72/gather/0.log"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.152088 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567194-t7ml6"]
Mar 20 18:34:00 crc kubenswrapper[4795]: E0320 18:34:00.153019 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cec1e8e-999e-44e2-a9b5-387a10c5de11" containerName="oc"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.153049 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cec1e8e-999e-44e2-a9b5-387a10c5de11" containerName="oc"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.153350 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cec1e8e-999e-44e2-a9b5-387a10c5de11" containerName="oc"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.154132 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.156290 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.156578 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.159790 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.161309 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-t7ml6"]
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.197909 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljfk2\" (UniqueName: \"kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2\") pod \"auto-csr-approver-29567194-t7ml6\" (UID: \"7ce7186d-a505-4b16-ae93-2d95886d5f2d\") " pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.299980 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljfk2\" (UniqueName: \"kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2\") pod \"auto-csr-approver-29567194-t7ml6\" (UID: \"7ce7186d-a505-4b16-ae93-2d95886d5f2d\") " pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.318270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljfk2\" (UniqueName: \"kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2\") pod \"auto-csr-approver-29567194-t7ml6\" (UID: \"7ce7186d-a505-4b16-ae93-2d95886d5f2d\") " pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.477882 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.938992 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-t7ml6"]
Mar 20 18:34:01 crc kubenswrapper[4795]: I0320 18:34:01.165954 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567194-t7ml6" event={"ID":"7ce7186d-a505-4b16-ae93-2d95886d5f2d","Type":"ContainerStarted","Data":"04c81d063f980e1aa073cef625e10b89bbeec4092afbf2754d0a2463270ef2a3"}
Mar 20 18:34:04 crc kubenswrapper[4795]: I0320 18:34:04.192337 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ce7186d-a505-4b16-ae93-2d95886d5f2d" containerID="640523aab60a0eb7070cfd04917b68ca823bf4802ffb825df958dcc7af70501e" exitCode=0
Mar 20 18:34:04 crc kubenswrapper[4795]: I0320 18:34:04.192444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567194-t7ml6" event={"ID":"7ce7186d-a505-4b16-ae93-2d95886d5f2d","Type":"ContainerDied","Data":"640523aab60a0eb7070cfd04917b68ca823bf4802ffb825df958dcc7af70501e"}
Mar 20 18:34:05 crc kubenswrapper[4795]: I0320 18:34:05.622061 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:05 crc kubenswrapper[4795]: I0320 18:34:05.701738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljfk2\" (UniqueName: \"kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2\") pod \"7ce7186d-a505-4b16-ae93-2d95886d5f2d\" (UID: \"7ce7186d-a505-4b16-ae93-2d95886d5f2d\") "
Mar 20 18:34:05 crc kubenswrapper[4795]: I0320 18:34:05.710904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2" (OuterVolumeSpecName: "kube-api-access-ljfk2") pod "7ce7186d-a505-4b16-ae93-2d95886d5f2d" (UID: "7ce7186d-a505-4b16-ae93-2d95886d5f2d"). InnerVolumeSpecName "kube-api-access-ljfk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:34:05 crc kubenswrapper[4795]: I0320 18:34:05.804494 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljfk2\" (UniqueName: \"kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2\") on node \"crc\" DevicePath \"\""
Mar 20 18:34:06 crc kubenswrapper[4795]: I0320 18:34:06.215826 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567194-t7ml6" event={"ID":"7ce7186d-a505-4b16-ae93-2d95886d5f2d","Type":"ContainerDied","Data":"04c81d063f980e1aa073cef625e10b89bbeec4092afbf2754d0a2463270ef2a3"}
Mar 20 18:34:06 crc kubenswrapper[4795]: I0320 18:34:06.215864 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c81d063f980e1aa073cef625e10b89bbeec4092afbf2754d0a2463270ef2a3"
Mar 20 18:34:06 crc kubenswrapper[4795]: I0320 18:34:06.215874 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:06 crc kubenswrapper[4795]: I0320 18:34:06.749991 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-97rr4"]
Mar 20 18:34:06 crc kubenswrapper[4795]: I0320 18:34:06.763657 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-97rr4"]
Mar 20 18:34:07 crc kubenswrapper[4795]: I0320 18:34:07.267855 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="551830bd-5613-42fb-b4ad-b1c6c6a0b09c" path="/var/lib/kubelet/pods/551830bd-5613-42fb-b4ad-b1c6c6a0b09c/volumes"
Mar 20 18:34:08 crc kubenswrapper[4795]: I0320 18:34:08.331837 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2wfg/must-gather-gb8cc"]
Mar 20 18:34:08 crc kubenswrapper[4795]: I0320 18:34:08.332628 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="copy" containerID="cri-o://752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11" gracePeriod=2
Mar 20 18:34:08 crc kubenswrapper[4795]: I0320 18:34:08.340517 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2wfg/must-gather-gb8cc"]
Mar 20 18:34:08 crc kubenswrapper[4795]: I0320 18:34:08.933546 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n2wfg_must-gather-gb8cc_a508da41-3cdb-4b99-b14e-a917c5153c72/copy/0.log"
Mar 20 18:34:08 crc kubenswrapper[4795]: I0320 18:34:08.934438 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/must-gather-gb8cc"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.067342 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq4p6\" (UniqueName: \"kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6\") pod \"a508da41-3cdb-4b99-b14e-a917c5153c72\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") "
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.067625 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output\") pod \"a508da41-3cdb-4b99-b14e-a917c5153c72\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") "
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.082264 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6" (OuterVolumeSpecName: "kube-api-access-wq4p6") pod "a508da41-3cdb-4b99-b14e-a917c5153c72" (UID: "a508da41-3cdb-4b99-b14e-a917c5153c72"). InnerVolumeSpecName "kube-api-access-wq4p6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.169733 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq4p6\" (UniqueName: \"kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6\") on node \"crc\" DevicePath \"\""
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.237619 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a508da41-3cdb-4b99-b14e-a917c5153c72" (UID: "a508da41-3cdb-4b99-b14e-a917c5153c72"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.250763 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n2wfg_must-gather-gb8cc_a508da41-3cdb-4b99-b14e-a917c5153c72/copy/0.log"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.251150 4795 generic.go:334] "Generic (PLEG): container finished" podID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerID="752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11" exitCode=143
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.251298 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/must-gather-gb8cc"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.261756 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" path="/var/lib/kubelet/pods/a508da41-3cdb-4b99-b14e-a917c5153c72/volumes"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.262389 4795 scope.go:117] "RemoveContainer" containerID="752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.273937 4795 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.285324 4795 scope.go:117] "RemoveContainer" containerID="fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.324469 4795 scope.go:117] "RemoveContainer" containerID="752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11"
Mar 20 18:34:09 crc kubenswrapper[4795]: E0320 18:34:09.324899 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11\": container with ID starting with 752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11 not found: ID does not exist" containerID="752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.324947 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11"} err="failed to get container status \"752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11\": rpc error: code = NotFound desc = could not find container \"752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11\": container with ID starting with 752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11 not found: ID does not exist"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.325027 4795 scope.go:117] "RemoveContainer" containerID="fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"
Mar 20 18:34:09 crc kubenswrapper[4795]: E0320 18:34:09.325342 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553\": container with ID starting with fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553 not found: ID does not exist" containerID="fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.325389 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"} err="failed to get container status \"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553\": rpc error: code = NotFound desc = could not find container \"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553\": container with ID starting with fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553 not found: ID does not exist"
Mar 20 18:34:12 crc kubenswrapper[4795]: I0320 18:34:12.299936 4795 scope.go:117] "RemoveContainer" containerID="ac8208908e414910c05525aac4a0b345cc75b5ac6d2db89cd45e2d5c13fcd4e8"
Mar 20 18:34:12 crc kubenswrapper[4795]: I0320 18:34:12.322681 4795 scope.go:117] "RemoveContainer" containerID="29253cb593d65e36df8393e9b2e7d2df325902972a57f35ad5e0d8767eaa777e"
Mar 20 18:34:13 crc kubenswrapper[4795]: I0320 18:34:13.253328 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:34:13 crc kubenswrapper[4795]: E0320 18:34:13.255305 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:34:26 crc kubenswrapper[4795]: I0320 18:34:26.252216 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:34:26 crc kubenswrapper[4795]: E0320 18:34:26.253362 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:34:39 crc kubenswrapper[4795]: I0320 18:34:39.252898 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:34:39 crc kubenswrapper[4795]: E0320 18:34:39.253552 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:34:51 crc kubenswrapper[4795]: I0320 18:34:51.253288 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:34:51 crc kubenswrapper[4795]: E0320 18:34:51.254719 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:35:03 crc kubenswrapper[4795]: I0320 18:35:03.252768 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:35:03 crc kubenswrapper[4795]: E0320 18:35:03.253703 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:35:12 crc kubenswrapper[4795]: I0320 18:35:12.442123 4795 scope.go:117] "RemoveContainer" containerID="740fd31422fdefe167ff9449396e8b7542957aeb0fadac8ada69ea60e5fabaf3"
Mar 20 18:35:15 crc kubenswrapper[4795]: I0320 18:35:15.257471 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:35:15 crc kubenswrapper[4795]: E0320 18:35:15.258418 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.747032 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"]
Mar 20 18:35:19 crc kubenswrapper[4795]: E0320 18:35:19.748201 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce7186d-a505-4b16-ae93-2d95886d5f2d" containerName="oc"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748222 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce7186d-a505-4b16-ae93-2d95886d5f2d" containerName="oc"
Mar 20 18:35:19 crc kubenswrapper[4795]: E0320 18:35:19.748265 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="gather"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748279 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="gather"
Mar 20 18:35:19 crc kubenswrapper[4795]: E0320 18:35:19.748320 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="copy"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748333 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="copy"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748651 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="copy"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748723 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce7186d-a505-4b16-ae93-2d95886d5f2d" containerName="oc"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748755 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="gather"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.751535 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.759355 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"]
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.861110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.861225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpvbv\" (UniqueName: \"kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.861367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.963400 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.963489 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.963566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpvbv\" (UniqueName: \"kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.963975 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.964095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.985780 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpvbv\" (UniqueName: \"kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:20 crc kubenswrapper[4795]: I0320 18:35:20.082863 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:20 crc kubenswrapper[4795]: I0320 18:35:20.560567 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"]
Mar 20 18:35:21 crc kubenswrapper[4795]: I0320 18:35:21.005532 4795 generic.go:334] "Generic (PLEG): container finished" podID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerID="bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242" exitCode=0
Mar 20 18:35:21 crc kubenswrapper[4795]: I0320 18:35:21.005635 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerDied","Data":"bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242"}
Mar 20 18:35:21 crc kubenswrapper[4795]: I0320 18:35:21.005917 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerStarted","Data":"ef650a02c257831f7b67cbc0f959703c18d4a6a48e121217bcbe9ea59e4d7d58"}
Mar 20 18:35:21 crc kubenswrapper[4795]: I0320 18:35:21.007929 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.039765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerStarted","Data":"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081"}
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.516673 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x5256"]
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.518742 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.530423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjt5t\" (UniqueName: \"kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.530616 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.530661 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc
kubenswrapper[4795]: I0320 18:35:23.548033 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x5256"] Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.633075 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.633129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.633279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjt5t\" (UniqueName: \"kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.633585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.633882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities\") pod 
\"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.659592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjt5t\" (UniqueName: \"kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.845419 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:24 crc kubenswrapper[4795]: I0320 18:35:24.049596 4795 generic.go:334] "Generic (PLEG): container finished" podID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerID="6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081" exitCode=0 Mar 20 18:35:24 crc kubenswrapper[4795]: I0320 18:35:24.049995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerDied","Data":"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081"} Mar 20 18:35:24 crc kubenswrapper[4795]: I0320 18:35:24.844562 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x5256"] Mar 20 18:35:24 crc kubenswrapper[4795]: W0320 18:35:24.848392 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb25b0297_a790_4365_aa94_c551db2f983d.slice/crio-a6466de712257e7c53fe2f0a704fae36469bf2f5d079ca120c58e0d49592295e WatchSource:0}: Error finding container a6466de712257e7c53fe2f0a704fae36469bf2f5d079ca120c58e0d49592295e: Status 404 returned error can't find the container with id 
a6466de712257e7c53fe2f0a704fae36469bf2f5d079ca120c58e0d49592295e Mar 20 18:35:25 crc kubenswrapper[4795]: I0320 18:35:25.059981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerStarted","Data":"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7"} Mar 20 18:35:25 crc kubenswrapper[4795]: I0320 18:35:25.061486 4795 generic.go:334] "Generic (PLEG): container finished" podID="b25b0297-a790-4365-aa94-c551db2f983d" containerID="ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c" exitCode=0 Mar 20 18:35:25 crc kubenswrapper[4795]: I0320 18:35:25.061512 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerDied","Data":"ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c"} Mar 20 18:35:25 crc kubenswrapper[4795]: I0320 18:35:25.061530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerStarted","Data":"a6466de712257e7c53fe2f0a704fae36469bf2f5d079ca120c58e0d49592295e"} Mar 20 18:35:25 crc kubenswrapper[4795]: I0320 18:35:25.083652 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sk5fl" podStartSLOduration=2.589637596 podStartE2EDuration="6.083632332s" podCreationTimestamp="2026-03-20 18:35:19 +0000 UTC" firstStartedPulling="2026-03-20 18:35:21.007650689 +0000 UTC m=+4664.465682230" lastFinishedPulling="2026-03-20 18:35:24.501645425 +0000 UTC m=+4667.959676966" observedRunningTime="2026-03-20 18:35:25.076528411 +0000 UTC m=+4668.534559952" watchObservedRunningTime="2026-03-20 18:35:25.083632332 +0000 UTC m=+4668.541663873" Mar 20 18:35:27 crc kubenswrapper[4795]: I0320 18:35:27.082714 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerStarted","Data":"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e"} Mar 20 18:35:28 crc kubenswrapper[4795]: I0320 18:35:28.097228 4795 generic.go:334] "Generic (PLEG): container finished" podID="b25b0297-a790-4365-aa94-c551db2f983d" containerID="316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e" exitCode=0 Mar 20 18:35:28 crc kubenswrapper[4795]: I0320 18:35:28.097302 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerDied","Data":"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e"} Mar 20 18:35:28 crc kubenswrapper[4795]: I0320 18:35:28.252410 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:35:28 crc kubenswrapper[4795]: E0320 18:35:28.252991 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:35:29 crc kubenswrapper[4795]: I0320 18:35:29.106233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerStarted","Data":"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28"} Mar 20 18:35:29 crc kubenswrapper[4795]: I0320 18:35:29.125991 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-x5256" podStartSLOduration=2.673062489 podStartE2EDuration="6.125976022s" podCreationTimestamp="2026-03-20 18:35:23 +0000 UTC" firstStartedPulling="2026-03-20 18:35:25.062846557 +0000 UTC m=+4668.520878098" lastFinishedPulling="2026-03-20 18:35:28.51576009 +0000 UTC m=+4671.973791631" observedRunningTime="2026-03-20 18:35:29.123432573 +0000 UTC m=+4672.581464114" watchObservedRunningTime="2026-03-20 18:35:29.125976022 +0000 UTC m=+4672.584007563" Mar 20 18:35:30 crc kubenswrapper[4795]: I0320 18:35:30.083958 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sk5fl" Mar 20 18:35:30 crc kubenswrapper[4795]: I0320 18:35:30.084272 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sk5fl" Mar 20 18:35:31 crc kubenswrapper[4795]: I0320 18:35:31.133675 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sk5fl" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="registry-server" probeResult="failure" output=< Mar 20 18:35:31 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 18:35:31 crc kubenswrapper[4795]: > Mar 20 18:35:33 crc kubenswrapper[4795]: I0320 18:35:33.846031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:33 crc kubenswrapper[4795]: I0320 18:35:33.846892 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:33 crc kubenswrapper[4795]: I0320 18:35:33.935365 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:34 crc kubenswrapper[4795]: I0320 18:35:34.205516 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:34 crc kubenswrapper[4795]: I0320 18:35:34.278806 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x5256"] Mar 20 18:35:36 crc kubenswrapper[4795]: I0320 18:35:36.174087 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x5256" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="registry-server" containerID="cri-o://18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28" gracePeriod=2 Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.183361 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.185443 4795 generic.go:334] "Generic (PLEG): container finished" podID="b25b0297-a790-4365-aa94-c551db2f983d" containerID="18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28" exitCode=0 Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.185484 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerDied","Data":"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28"} Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.185515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerDied","Data":"a6466de712257e7c53fe2f0a704fae36469bf2f5d079ca120c58e0d49592295e"} Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.185531 4795 scope.go:117] "RemoveContainer" containerID="18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.219513 4795 scope.go:117] "RemoveContainer" 
containerID="316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.236783 4795 scope.go:117] "RemoveContainer" containerID="ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.285103 4795 scope.go:117] "RemoveContainer" containerID="18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28" Mar 20 18:35:37 crc kubenswrapper[4795]: E0320 18:35:37.285505 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28\": container with ID starting with 18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28 not found: ID does not exist" containerID="18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.285537 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28"} err="failed to get container status \"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28\": rpc error: code = NotFound desc = could not find container \"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28\": container with ID starting with 18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28 not found: ID does not exist" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.285556 4795 scope.go:117] "RemoveContainer" containerID="316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e" Mar 20 18:35:37 crc kubenswrapper[4795]: E0320 18:35:37.285802 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e\": container with ID starting with 
316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e not found: ID does not exist" containerID="316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.285824 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e"} err="failed to get container status \"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e\": rpc error: code = NotFound desc = could not find container \"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e\": container with ID starting with 316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e not found: ID does not exist" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.285838 4795 scope.go:117] "RemoveContainer" containerID="ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c" Mar 20 18:35:37 crc kubenswrapper[4795]: E0320 18:35:37.286396 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c\": container with ID starting with ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c not found: ID does not exist" containerID="ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.286493 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c"} err="failed to get container status \"ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c\": rpc error: code = NotFound desc = could not find container \"ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c\": container with ID starting with ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c not found: ID does not 
exist" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.301853 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjt5t\" (UniqueName: \"kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t\") pod \"b25b0297-a790-4365-aa94-c551db2f983d\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.302047 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities\") pod \"b25b0297-a790-4365-aa94-c551db2f983d\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.302160 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content\") pod \"b25b0297-a790-4365-aa94-c551db2f983d\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.309330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t" (OuterVolumeSpecName: "kube-api-access-rjt5t") pod "b25b0297-a790-4365-aa94-c551db2f983d" (UID: "b25b0297-a790-4365-aa94-c551db2f983d"). InnerVolumeSpecName "kube-api-access-rjt5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.315767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities" (OuterVolumeSpecName: "utilities") pod "b25b0297-a790-4365-aa94-c551db2f983d" (UID: "b25b0297-a790-4365-aa94-c551db2f983d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.361558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b25b0297-a790-4365-aa94-c551db2f983d" (UID: "b25b0297-a790-4365-aa94-c551db2f983d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.403861 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.403908 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjt5t\" (UniqueName: \"kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.403919 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:38 crc kubenswrapper[4795]: I0320 18:35:38.197874 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:38 crc kubenswrapper[4795]: I0320 18:35:38.242709 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x5256"] Mar 20 18:35:38 crc kubenswrapper[4795]: I0320 18:35:38.258140 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x5256"] Mar 20 18:35:39 crc kubenswrapper[4795]: I0320 18:35:39.268805 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b25b0297-a790-4365-aa94-c551db2f983d" path="/var/lib/kubelet/pods/b25b0297-a790-4365-aa94-c551db2f983d/volumes" Mar 20 18:35:40 crc kubenswrapper[4795]: I0320 18:35:40.140214 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sk5fl" Mar 20 18:35:40 crc kubenswrapper[4795]: I0320 18:35:40.201639 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sk5fl" Mar 20 18:35:40 crc kubenswrapper[4795]: I0320 18:35:40.252987 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:35:40 crc kubenswrapper[4795]: E0320 18:35:40.253586 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:35:40 crc kubenswrapper[4795]: I0320 18:35:40.590871 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"] Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.268049 4795 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-operators-sk5fl" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="registry-server" containerID="cri-o://929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7" gracePeriod=2 Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.809955 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sk5fl" Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.885112 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content\") pod \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.886901 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpvbv\" (UniqueName: \"kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv\") pod \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.886976 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities\") pod \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.888339 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities" (OuterVolumeSpecName: "utilities") pod "31c30b03-2316-4ccf-bb7d-379da1b6ba23" (UID: "31c30b03-2316-4ccf-bb7d-379da1b6ba23"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.895495 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv" (OuterVolumeSpecName: "kube-api-access-xpvbv") pod "31c30b03-2316-4ccf-bb7d-379da1b6ba23" (UID: "31c30b03-2316-4ccf-bb7d-379da1b6ba23"). InnerVolumeSpecName "kube-api-access-xpvbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.989797 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpvbv\" (UniqueName: \"kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.989829 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.018354 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31c30b03-2316-4ccf-bb7d-379da1b6ba23" (UID: "31c30b03-2316-4ccf-bb7d-379da1b6ba23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.092929 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.279517 4795 generic.go:334] "Generic (PLEG): container finished" podID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerID="929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7" exitCode=0 Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.279560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerDied","Data":"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7"} Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.279591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerDied","Data":"ef650a02c257831f7b67cbc0f959703c18d4a6a48e121217bcbe9ea59e4d7d58"} Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.279612 4795 scope.go:117] "RemoveContainer" containerID="929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.279834 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sk5fl" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.314153 4795 scope.go:117] "RemoveContainer" containerID="6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.337608 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"] Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.344999 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"] Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.353527 4795 scope.go:117] "RemoveContainer" containerID="bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.377970 4795 scope.go:117] "RemoveContainer" containerID="929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7" Mar 20 18:35:42 crc kubenswrapper[4795]: E0320 18:35:42.380185 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7\": container with ID starting with 929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7 not found: ID does not exist" containerID="929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.380216 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7"} err="failed to get container status \"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7\": rpc error: code = NotFound desc = could not find container \"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7\": container with ID starting with 929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7 not found: ID does 
not exist" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.380255 4795 scope.go:117] "RemoveContainer" containerID="6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081" Mar 20 18:35:42 crc kubenswrapper[4795]: E0320 18:35:42.380597 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081\": container with ID starting with 6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081 not found: ID does not exist" containerID="6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.380640 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081"} err="failed to get container status \"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081\": rpc error: code = NotFound desc = could not find container \"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081\": container with ID starting with 6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081 not found: ID does not exist" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.380666 4795 scope.go:117] "RemoveContainer" containerID="bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242" Mar 20 18:35:42 crc kubenswrapper[4795]: E0320 18:35:42.380959 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242\": container with ID starting with bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242 not found: ID does not exist" containerID="bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.380981 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242"} err="failed to get container status \"bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242\": rpc error: code = NotFound desc = could not find container \"bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242\": container with ID starting with bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242 not found: ID does not exist" Mar 20 18:35:42 crc kubenswrapper[4795]: E0320 18:35:42.486735 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c30b03_2316_4ccf_bb7d_379da1b6ba23.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c30b03_2316_4ccf_bb7d_379da1b6ba23.slice/crio-ef650a02c257831f7b67cbc0f959703c18d4a6a48e121217bcbe9ea59e4d7d58\": RecentStats: unable to find data in memory cache]" Mar 20 18:35:43 crc kubenswrapper[4795]: I0320 18:35:43.269272 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" path="/var/lib/kubelet/pods/31c30b03-2316-4ccf-bb7d-379da1b6ba23/volumes" Mar 20 18:35:53 crc kubenswrapper[4795]: I0320 18:35:53.252527 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:35:53 crc kubenswrapper[4795]: E0320 18:35:53.253321 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 
18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.171280 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567196-t9x4q"] Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.174515 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="extract-utilities" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.174581 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="extract-utilities" Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.175307 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="extract-utilities" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.175379 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="extract-utilities" Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.175410 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="extract-content" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.175425 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="extract-content" Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.175460 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="registry-server" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.175472 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="registry-server" Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.175529 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="registry-server" Mar 20 18:36:00 crc 
kubenswrapper[4795]: I0320 18:36:00.175542 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="registry-server" Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.175580 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="extract-content" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.175591 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="extract-content" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.176469 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="registry-server" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.176512 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="registry-server" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.180193 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.185474 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.185884 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.186343 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.197334 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-t9x4q"] Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.203409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfqdd\" (UniqueName: \"kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd\") pod \"auto-csr-approver-29567196-t9x4q\" (UID: \"7203100a-018c-4662-a760-a16bd5c6322d\") " pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.305095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfqdd\" (UniqueName: \"kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd\") pod \"auto-csr-approver-29567196-t9x4q\" (UID: \"7203100a-018c-4662-a760-a16bd5c6322d\") " pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.324418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfqdd\" (UniqueName: \"kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd\") pod \"auto-csr-approver-29567196-t9x4q\" (UID: \"7203100a-018c-4662-a760-a16bd5c6322d\") " 
pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.504952 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.955759 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-t9x4q"] Mar 20 18:36:00 crc kubenswrapper[4795]: W0320 18:36:00.960101 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7203100a_018c_4662_a760_a16bd5c6322d.slice/crio-556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8 WatchSource:0}: Error finding container 556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8: Status 404 returned error can't find the container with id 556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8 Mar 20 18:36:01 crc kubenswrapper[4795]: I0320 18:36:01.783985 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" event={"ID":"7203100a-018c-4662-a760-a16bd5c6322d","Type":"ContainerStarted","Data":"556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8"} Mar 20 18:36:02 crc kubenswrapper[4795]: I0320 18:36:02.793404 4795 generic.go:334] "Generic (PLEG): container finished" podID="7203100a-018c-4662-a760-a16bd5c6322d" containerID="dd786a69be248d53a6715fb536c79a06b01be09807a6bb21bbca9e7786db827c" exitCode=0 Mar 20 18:36:02 crc kubenswrapper[4795]: I0320 18:36:02.793858 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" event={"ID":"7203100a-018c-4662-a760-a16bd5c6322d","Type":"ContainerDied","Data":"dd786a69be248d53a6715fb536c79a06b01be09807a6bb21bbca9e7786db827c"} Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.251234 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.389246 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfqdd\" (UniqueName: \"kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd\") pod \"7203100a-018c-4662-a760-a16bd5c6322d\" (UID: \"7203100a-018c-4662-a760-a16bd5c6322d\") " Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.399680 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd" (OuterVolumeSpecName: "kube-api-access-wfqdd") pod "7203100a-018c-4662-a760-a16bd5c6322d" (UID: "7203100a-018c-4662-a760-a16bd5c6322d"). InnerVolumeSpecName "kube-api-access-wfqdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.495388 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfqdd\" (UniqueName: \"kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd\") on node \"crc\" DevicePath \"\"" Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.818976 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" event={"ID":"7203100a-018c-4662-a760-a16bd5c6322d","Type":"ContainerDied","Data":"556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8"} Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.819037 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.819054 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8" Mar 20 18:36:05 crc kubenswrapper[4795]: I0320 18:36:05.329166 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-fmpq8"] Mar 20 18:36:05 crc kubenswrapper[4795]: I0320 18:36:05.344287 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-fmpq8"] Mar 20 18:36:07 crc kubenswrapper[4795]: I0320 18:36:07.270156 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e72f01-1ab6-47a2-99d2-ff2778039c34" path="/var/lib/kubelet/pods/a7e72f01-1ab6-47a2-99d2-ff2778039c34/volumes" Mar 20 18:36:08 crc kubenswrapper[4795]: I0320 18:36:08.252276 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:36:08 crc kubenswrapper[4795]: E0320 18:36:08.252969 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:36:12 crc kubenswrapper[4795]: I0320 18:36:12.506520 4795 scope.go:117] "RemoveContainer" containerID="e6a437e3ef5671482fc87ddf7b0443a4a6151e38d08d0d94800ebfc859f95be2" Mar 20 18:36:22 crc kubenswrapper[4795]: I0320 18:36:22.252399 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:36:22 crc kubenswrapper[4795]: E0320 18:36:22.253291 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:36:34 crc kubenswrapper[4795]: I0320 18:36:34.252381 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:36:34 crc kubenswrapper[4795]: E0320 18:36:34.253256 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:36:48 crc kubenswrapper[4795]: I0320 18:36:48.253428 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:36:49 crc kubenswrapper[4795]: I0320 18:36:49.320799 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e"} Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.860655 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4zsvz/must-gather-k5nt7"] Mar 20 18:36:52 crc kubenswrapper[4795]: E0320 18:36:52.861424 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7203100a-018c-4662-a760-a16bd5c6322d" containerName="oc" Mar 20 18:36:52 crc 
kubenswrapper[4795]: I0320 18:36:52.861436 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7203100a-018c-4662-a760-a16bd5c6322d" containerName="oc" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.861619 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7203100a-018c-4662-a760-a16bd5c6322d" containerName="oc" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.862579 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.872161 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4zsvz"/"default-dockercfg-kfszv" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.872258 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4zsvz"/"kube-root-ca.crt" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.872295 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4zsvz"/"openshift-service-ca.crt" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.881849 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4zsvz/must-gather-k5nt7"] Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.058842 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output\") pod \"must-gather-k5nt7\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.059306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wwcp\" (UniqueName: \"kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp\") pod \"must-gather-k5nt7\" 
(UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.161442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wwcp\" (UniqueName: \"kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp\") pod \"must-gather-k5nt7\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.161533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output\") pod \"must-gather-k5nt7\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.162104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output\") pod \"must-gather-k5nt7\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.180812 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wwcp\" (UniqueName: \"kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp\") pod \"must-gather-k5nt7\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.193320 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.704302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4zsvz/must-gather-k5nt7"] Mar 20 18:36:54 crc kubenswrapper[4795]: I0320 18:36:54.377715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" event={"ID":"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e","Type":"ContainerStarted","Data":"7e428e78fa28b099376d48c4437d22fc8fb058f496bc7d67f2ecb14cd1bd3b22"} Mar 20 18:36:54 crc kubenswrapper[4795]: I0320 18:36:54.378350 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" event={"ID":"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e","Type":"ContainerStarted","Data":"f7dc1c0fd67006d7343e6ade29b0720cb88d319027089039c39df72434041f5f"} Mar 20 18:36:54 crc kubenswrapper[4795]: I0320 18:36:54.378373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" event={"ID":"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e","Type":"ContainerStarted","Data":"b6390a675269ff059a18fc9962b84b8fffd34764b948261b176265f106e342a0"} Mar 20 18:36:54 crc kubenswrapper[4795]: I0320 18:36:54.409527 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" podStartSLOduration=2.409502483 podStartE2EDuration="2.409502483s" podCreationTimestamp="2026-03-20 18:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:36:54.396664085 +0000 UTC m=+4757.854695636" watchObservedRunningTime="2026-03-20 18:36:54.409502483 +0000 UTC m=+4757.867534054" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.581436 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-stnvk"] Mar 20 18:36:57 crc kubenswrapper[4795]: 
I0320 18:36:57.583065 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.769155 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.769225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wghz7\" (UniqueName: \"kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.871520 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.871586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wghz7\" (UniqueName: \"kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.872025 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") 
" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.893198 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wghz7\" (UniqueName: \"kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.899570 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: W0320 18:36:57.941323 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b756ee6_eaba_4ef9_8aeb_932fa022ff67.slice/crio-0a4dc9044ec952b8781529a760ed519c3480b867616fce2b184ea3da7fd1787a WatchSource:0}: Error finding container 0a4dc9044ec952b8781529a760ed519c3480b867616fce2b184ea3da7fd1787a: Status 404 returned error can't find the container with id 0a4dc9044ec952b8781529a760ed519c3480b867616fce2b184ea3da7fd1787a Mar 20 18:36:58 crc kubenswrapper[4795]: I0320 18:36:58.421281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" event={"ID":"1b756ee6-eaba-4ef9-8aeb-932fa022ff67","Type":"ContainerStarted","Data":"aec27dd4223f5fe63f5054576fd5e2585cf7b3f8895b249700a87a28254da07b"} Mar 20 18:36:58 crc kubenswrapper[4795]: I0320 18:36:58.421936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" event={"ID":"1b756ee6-eaba-4ef9-8aeb-932fa022ff67","Type":"ContainerStarted","Data":"0a4dc9044ec952b8781529a760ed519c3480b867616fce2b184ea3da7fd1787a"} Mar 20 18:37:37 crc kubenswrapper[4795]: I0320 18:37:37.791763 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b756ee6-eaba-4ef9-8aeb-932fa022ff67" 
containerID="aec27dd4223f5fe63f5054576fd5e2585cf7b3f8895b249700a87a28254da07b" exitCode=0 Mar 20 18:37:37 crc kubenswrapper[4795]: I0320 18:37:37.791882 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" event={"ID":"1b756ee6-eaba-4ef9-8aeb-932fa022ff67","Type":"ContainerDied","Data":"aec27dd4223f5fe63f5054576fd5e2585cf7b3f8895b249700a87a28254da07b"} Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.905574 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.921588 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host\") pod \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.921771 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wghz7\" (UniqueName: \"kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7\") pod \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.924259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host" (OuterVolumeSpecName: "host") pod "1b756ee6-eaba-4ef9-8aeb-932fa022ff67" (UID: "1b756ee6-eaba-4ef9-8aeb-932fa022ff67"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.929268 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7" (OuterVolumeSpecName: "kube-api-access-wghz7") pod "1b756ee6-eaba-4ef9-8aeb-932fa022ff67" (UID: "1b756ee6-eaba-4ef9-8aeb-932fa022ff67"). InnerVolumeSpecName "kube-api-access-wghz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.943902 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-stnvk"] Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.951449 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-stnvk"] Mar 20 18:37:39 crc kubenswrapper[4795]: I0320 18:37:39.025072 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:39 crc kubenswrapper[4795]: I0320 18:37:39.025121 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wghz7\" (UniqueName: \"kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:39 crc kubenswrapper[4795]: I0320 18:37:39.278065 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b756ee6-eaba-4ef9-8aeb-932fa022ff67" path="/var/lib/kubelet/pods/1b756ee6-eaba-4ef9-8aeb-932fa022ff67/volumes" Mar 20 18:37:39 crc kubenswrapper[4795]: I0320 18:37:39.811023 4795 scope.go:117] "RemoveContainer" containerID="aec27dd4223f5fe63f5054576fd5e2585cf7b3f8895b249700a87a28254da07b" Mar 20 18:37:39 crc kubenswrapper[4795]: I0320 18:37:39.811048 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.126554 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-t9n52"] Mar 20 18:37:40 crc kubenswrapper[4795]: E0320 18:37:40.127028 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b756ee6-eaba-4ef9-8aeb-932fa022ff67" containerName="container-00" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.127045 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b756ee6-eaba-4ef9-8aeb-932fa022ff67" containerName="container-00" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.127297 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b756ee6-eaba-4ef9-8aeb-932fa022ff67" containerName="container-00" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.128195 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.247037 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.247114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbk5c\" (UniqueName: \"kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.348780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.348834 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbk5c\" (UniqueName: \"kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.348927 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.369454 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbk5c\" (UniqueName: \"kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.449015 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.819953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" event={"ID":"a2ef605a-8561-4538-9d09-b6635a813341","Type":"ContainerStarted","Data":"cd8251461060b3f830ae5964db666604d5a280ac18ed90797c8b57637d858b58"} Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.820009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" event={"ID":"a2ef605a-8561-4538-9d09-b6635a813341","Type":"ContainerStarted","Data":"16a3231887d7ca689198aece22c1fca21228396e2e6bf1e6133f03b90cbf7649"} Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.836108 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" podStartSLOduration=0.836090887 podStartE2EDuration="836.090887ms" podCreationTimestamp="2026-03-20 18:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:37:40.830584057 +0000 UTC m=+4804.288615598" watchObservedRunningTime="2026-03-20 18:37:40.836090887 +0000 UTC m=+4804.294122418" Mar 20 18:37:41 crc kubenswrapper[4795]: I0320 18:37:41.844126 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2ef605a-8561-4538-9d09-b6635a813341" containerID="cd8251461060b3f830ae5964db666604d5a280ac18ed90797c8b57637d858b58" exitCode=0 Mar 20 18:37:41 crc kubenswrapper[4795]: I0320 18:37:41.844140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" event={"ID":"a2ef605a-8561-4538-9d09-b6635a813341","Type":"ContainerDied","Data":"cd8251461060b3f830ae5964db666604d5a280ac18ed90797c8b57637d858b58"} Mar 20 18:37:42 crc kubenswrapper[4795]: I0320 18:37:42.953249 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.099001 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbk5c\" (UniqueName: \"kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c\") pod \"a2ef605a-8561-4538-9d09-b6635a813341\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.099206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host\") pod \"a2ef605a-8561-4538-9d09-b6635a813341\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.099331 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host" (OuterVolumeSpecName: "host") pod "a2ef605a-8561-4538-9d09-b6635a813341" (UID: "a2ef605a-8561-4538-9d09-b6635a813341"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.099653 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.111024 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c" (OuterVolumeSpecName: "kube-api-access-xbk5c") pod "a2ef605a-8561-4538-9d09-b6635a813341" (UID: "a2ef605a-8561-4538-9d09-b6635a813341"). InnerVolumeSpecName "kube-api-access-xbk5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.118017 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-t9n52"] Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.128215 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-t9n52"] Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.201049 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbk5c\" (UniqueName: \"kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.262776 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ef605a-8561-4538-9d09-b6635a813341" path="/var/lib/kubelet/pods/a2ef605a-8561-4538-9d09-b6635a813341/volumes" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.857136 4795 scope.go:117] "RemoveContainer" containerID="cd8251461060b3f830ae5964db666604d5a280ac18ed90797c8b57637d858b58" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.857189 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.275542 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-sz4jb"] Mar 20 18:37:44 crc kubenswrapper[4795]: E0320 18:37:44.276064 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ef605a-8561-4538-9d09-b6635a813341" containerName="container-00" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.276078 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ef605a-8561-4538-9d09-b6635a813341" containerName="container-00" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.276244 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ef605a-8561-4538-9d09-b6635a813341" containerName="container-00" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.276940 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.421117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.421261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfm2r\" (UniqueName: \"kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.523188 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfm2r\" (UniqueName: 
\"kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.523367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.523476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.540745 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfm2r\" (UniqueName: \"kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.628403 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.867490 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" event={"ID":"dede02ab-ca23-4adf-9816-ac5ba6aa81b5","Type":"ContainerStarted","Data":"bd1eb6bb2bc7f7f800ad219a622e9dcf1061f7d09315438bdb06c0fefccbe684"} Mar 20 18:37:45 crc kubenswrapper[4795]: I0320 18:37:45.883365 4795 generic.go:334] "Generic (PLEG): container finished" podID="dede02ab-ca23-4adf-9816-ac5ba6aa81b5" containerID="492c90a8d086b5d0bece8209730cbc8cb52f67859378e5062d5adb6e9f1149f8" exitCode=0 Mar 20 18:37:45 crc kubenswrapper[4795]: I0320 18:37:45.883469 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" event={"ID":"dede02ab-ca23-4adf-9816-ac5ba6aa81b5","Type":"ContainerDied","Data":"492c90a8d086b5d0bece8209730cbc8cb52f67859378e5062d5adb6e9f1149f8"} Mar 20 18:37:45 crc kubenswrapper[4795]: I0320 18:37:45.924046 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-sz4jb"] Mar 20 18:37:45 crc kubenswrapper[4795]: I0320 18:37:45.934549 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-sz4jb"] Mar 20 18:37:46 crc kubenswrapper[4795]: I0320 18:37:46.982088 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.168884 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfm2r\" (UniqueName: \"kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r\") pod \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.169315 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host\") pod \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.169393 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host" (OuterVolumeSpecName: "host") pod "dede02ab-ca23-4adf-9816-ac5ba6aa81b5" (UID: "dede02ab-ca23-4adf-9816-ac5ba6aa81b5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.169996 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.180516 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r" (OuterVolumeSpecName: "kube-api-access-lfm2r") pod "dede02ab-ca23-4adf-9816-ac5ba6aa81b5" (UID: "dede02ab-ca23-4adf-9816-ac5ba6aa81b5"). InnerVolumeSpecName "kube-api-access-lfm2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.264451 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dede02ab-ca23-4adf-9816-ac5ba6aa81b5" path="/var/lib/kubelet/pods/dede02ab-ca23-4adf-9816-ac5ba6aa81b5/volumes" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.278008 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfm2r\" (UniqueName: \"kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.901210 4795 scope.go:117] "RemoveContainer" containerID="492c90a8d086b5d0bece8209730cbc8cb52f67859378e5062d5adb6e9f1149f8" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.901244 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.161569 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567198-hmxc4"] Mar 20 18:38:00 crc kubenswrapper[4795]: E0320 18:38:00.164133 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dede02ab-ca23-4adf-9816-ac5ba6aa81b5" containerName="container-00" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.164306 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dede02ab-ca23-4adf-9816-ac5ba6aa81b5" containerName="container-00" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.164739 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dede02ab-ca23-4adf-9816-ac5ba6aa81b5" containerName="container-00" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.165632 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.172269 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.172462 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.172721 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.176851 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-hmxc4"] Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.242642 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvvxb\" (UniqueName: \"kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb\") pod \"auto-csr-approver-29567198-hmxc4\" (UID: \"73777af0-dee3-47d4-a9d2-a48649e84e4d\") " pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.344618 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvvxb\" (UniqueName: \"kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb\") pod \"auto-csr-approver-29567198-hmxc4\" (UID: \"73777af0-dee3-47d4-a9d2-a48649e84e4d\") " pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.362498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvvxb\" (UniqueName: \"kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb\") pod \"auto-csr-approver-29567198-hmxc4\" (UID: \"73777af0-dee3-47d4-a9d2-a48649e84e4d\") " 
pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.491888 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.966523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-hmxc4"] Mar 20 18:38:01 crc kubenswrapper[4795]: I0320 18:38:01.020836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" event={"ID":"73777af0-dee3-47d4-a9d2-a48649e84e4d","Type":"ContainerStarted","Data":"9c6787e09d922da99c9dc942ad9c34c154b0df26e872c3ec0399f194f3baab37"} Mar 20 18:38:03 crc kubenswrapper[4795]: I0320 18:38:03.039455 4795 generic.go:334] "Generic (PLEG): container finished" podID="73777af0-dee3-47d4-a9d2-a48649e84e4d" containerID="6303fb5093a0bb0022b32e4ef548448b867a34002a11dd9fd46d7dd786dcfd17" exitCode=0 Mar 20 18:38:03 crc kubenswrapper[4795]: I0320 18:38:03.039556 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" event={"ID":"73777af0-dee3-47d4-a9d2-a48649e84e4d","Type":"ContainerDied","Data":"6303fb5093a0bb0022b32e4ef548448b867a34002a11dd9fd46d7dd786dcfd17"} Mar 20 18:38:04 crc kubenswrapper[4795]: I0320 18:38:04.436535 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:04 crc kubenswrapper[4795]: I0320 18:38:04.624848 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvvxb\" (UniqueName: \"kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb\") pod \"73777af0-dee3-47d4-a9d2-a48649e84e4d\" (UID: \"73777af0-dee3-47d4-a9d2-a48649e84e4d\") " Mar 20 18:38:04 crc kubenswrapper[4795]: I0320 18:38:04.636767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb" (OuterVolumeSpecName: "kube-api-access-tvvxb") pod "73777af0-dee3-47d4-a9d2-a48649e84e4d" (UID: "73777af0-dee3-47d4-a9d2-a48649e84e4d"). InnerVolumeSpecName "kube-api-access-tvvxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:38:04 crc kubenswrapper[4795]: I0320 18:38:04.728233 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvvxb\" (UniqueName: \"kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb\") on node \"crc\" DevicePath \"\"" Mar 20 18:38:05 crc kubenswrapper[4795]: I0320 18:38:05.073031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" event={"ID":"73777af0-dee3-47d4-a9d2-a48649e84e4d","Type":"ContainerDied","Data":"9c6787e09d922da99c9dc942ad9c34c154b0df26e872c3ec0399f194f3baab37"} Mar 20 18:38:05 crc kubenswrapper[4795]: I0320 18:38:05.073079 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c6787e09d922da99c9dc942ad9c34c154b0df26e872c3ec0399f194f3baab37" Mar 20 18:38:05 crc kubenswrapper[4795]: I0320 18:38:05.073141 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:05 crc kubenswrapper[4795]: I0320 18:38:05.508593 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-4p2jk"] Mar 20 18:38:05 crc kubenswrapper[4795]: I0320 18:38:05.518435 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-4p2jk"] Mar 20 18:38:07 crc kubenswrapper[4795]: I0320 18:38:07.278162 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cec1e8e-999e-44e2-a9b5-387a10c5de11" path="/var/lib/kubelet/pods/6cec1e8e-999e-44e2-a9b5-387a10c5de11/volumes" Mar 20 18:38:12 crc kubenswrapper[4795]: I0320 18:38:12.904486 4795 scope.go:117] "RemoveContainer" containerID="aef631dab3c0134ed5918f9313f1d6153c072f4d69296d8ee64df68955ab56a0" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.173511 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84776bb8f8-wkk7m_6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97/barbican-api/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.294024 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84776bb8f8-wkk7m_6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97/barbican-api-log/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.364359 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76977cb5bb-84w8l_faa8c15c-b759-4db8-ac4d-28648a8cfde2/barbican-keystone-listener/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.507661 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76977cb5bb-84w8l_faa8c15c-b759-4db8-ac4d-28648a8cfde2/barbican-keystone-listener-log/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.569840 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-558cc4f6c9-d6wp7_aa9d179b-7e78-4a37-80aa-3f3f6e7cabea/barbican-worker-log/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.580669 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-558cc4f6c9-d6wp7_aa9d179b-7e78-4a37-80aa-3f3f6e7cabea/barbican-worker/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.876555 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/ceilometer-central-agent/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.962070 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/ceilometer-notification-agent/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.059775 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/proxy-httpd/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.065111 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/sg-core/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.069677 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-65dps_0708214e-e711-465a-a54e-97a462b2777e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.277205 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0b19426b-81a4-4498-9754-948e8b7154d9/cinder-api-log/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.297836 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0b19426b-81a4-4498-9754-948e8b7154d9/cinder-api/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.471851 4795 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_cinder-scheduler-0_8eeb13a5-ef36-44eb-9dfd-7798e9ad1620/cinder-scheduler/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.529240 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8eeb13a5-ef36-44eb-9dfd-7798e9ad1620/probe/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.750270 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hprm9_2bad20c9-d77a-4c30-8fa2-979c05697cf4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.947287 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm_3d666090-1065-4b2d-9ac6-b84776b53d0a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.042966 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/init/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.218691 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/init/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.375468 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/dnsmasq-dns/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.512497 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k_b0af5324-4ba3-4a12-9fdb-b467918ba19d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.572298 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_264c2db4-1919-41ce-aea3-bd777167a9ca/glance-log/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.582225 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_264c2db4-1919-41ce-aea3-bd777167a9ca/glance-httpd/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.718152 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_81d40eb0-c26d-46e7-b8be-631de2f502b9/glance-httpd/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.792406 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_81d40eb0-c26d-46e7-b8be-631de2f502b9/glance-log/0.log" Mar 20 18:38:36 crc kubenswrapper[4795]: I0320 18:38:36.046381 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7fb74ddb8-dbrvh_f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109/horizon/0.log" Mar 20 18:38:36 crc kubenswrapper[4795]: I0320 18:38:36.195604 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5_0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:36 crc kubenswrapper[4795]: I0320 18:38:36.529774 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7fb74ddb8-dbrvh_f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109/horizon-log/0.log" Mar 20 18:38:36 crc kubenswrapper[4795]: I0320 18:38:36.782433 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567161-t26vc_cdfe5ffc-ab15-4277-966f-f506e725e8b1/keystone-cron/0.log" Mar 20 18:38:36 crc kubenswrapper[4795]: I0320 18:38:36.982182 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_72605c7d-99df-450f-900b-3022b0520149/kube-state-metrics/0.log" Mar 20 18:38:37 crc kubenswrapper[4795]: I0320 
18:38:37.351332 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85b996ff68-fdzxg_7b20a034-11f6-40ad-9447-32c49f705c07/keystone-api/0.log" Mar 20 18:38:37 crc kubenswrapper[4795]: I0320 18:38:37.420584 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5rj55_20b330a0-830c-419e-81fe-a36dd1a32cc2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:38 crc kubenswrapper[4795]: I0320 18:38:38.038108 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-649db44647-mrjns_5a472785-4467-4c97-93b9-e6f6eff19126/neutron-httpd/0.log" Mar 20 18:38:38 crc kubenswrapper[4795]: I0320 18:38:38.225235 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7_e29f4857-ff0d-4806-ba09-74448200e8e2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:38 crc kubenswrapper[4795]: I0320 18:38:38.248461 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-649db44647-mrjns_5a472785-4467-4c97-93b9-e6f6eff19126/neutron-api/0.log" Mar 20 18:38:39 crc kubenswrapper[4795]: I0320 18:38:39.261522 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5916e4d2-2863-4088-be97-cf368906820b/nova-cell0-conductor-conductor/0.log" Mar 20 18:38:39 crc kubenswrapper[4795]: I0320 18:38:39.668178 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_480a6609-0395-4bda-9ec8-a3ebf30931a7/nova-api-log/0.log" Mar 20 18:38:39 crc kubenswrapper[4795]: I0320 18:38:39.740429 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rp77q_b6da9d2a-e18f-4994-b8f3-6b1eb969564b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:39 crc kubenswrapper[4795]: I0320 18:38:39.878438 4795 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_19c15c93-572c-4d53-b924-172f3ad29c8a/nova-cell1-conductor-conductor/0.log" Mar 20 18:38:40 crc kubenswrapper[4795]: I0320 18:38:40.148504 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d2a5e398-6d25-43b1-8c29-407af2d9348b/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 18:38:40 crc kubenswrapper[4795]: I0320 18:38:40.436204 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_480a6609-0395-4bda-9ec8-a3ebf30931a7/nova-api-api/0.log" Mar 20 18:38:40 crc kubenswrapper[4795]: I0320 18:38:40.452918 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ff4af01b-01b5-4154-8591-7ec99e3d6ef0/nova-metadata-log/0.log" Mar 20 18:38:41 crc kubenswrapper[4795]: I0320 18:38:41.069193 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/mysql-bootstrap/0.log" Mar 20 18:38:41 crc kubenswrapper[4795]: I0320 18:38:41.171746 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ff4af01b-01b5-4154-8591-7ec99e3d6ef0/nova-metadata-metadata/0.log" Mar 20 18:38:41 crc kubenswrapper[4795]: I0320 18:38:41.249538 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c23f56ff-eceb-4891-87e5-57ebeb7eba8d/nova-scheduler-scheduler/0.log" Mar 20 18:38:41 crc kubenswrapper[4795]: I0320 18:38:41.935964 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-kc4wx_709f5080-c511-4d3b-bc9c-baeec85fa245/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.162651 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/galera/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 
18:38:42.177076 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/mysql-bootstrap/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.253329 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/mysql-bootstrap/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.482972 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/galera/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.511185 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_cf3f8aea-393e-418a-ad14-2848c8df93e9/openstackclient/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.536863 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/mysql-bootstrap/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.690736 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dnp2g_28df10bb-d6a9-47a9-9b79-0bb9665529ef/ovn-controller/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.751943 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-n4gzx_85004117-20bc-474e-88f5-ce49032749ff/openstack-network-exporter/0.log" Mar 20 18:38:43 crc kubenswrapper[4795]: I0320 18:38:43.061136 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server-init/0.log" Mar 20 18:38:43 crc kubenswrapper[4795]: I0320 18:38:43.750085 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server-init/0.log" Mar 20 18:38:43 crc kubenswrapper[4795]: I0320 18:38:43.754303 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovs-vswitchd/0.log" Mar 20 18:38:43 crc kubenswrapper[4795]: I0320 18:38:43.820255 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.024917 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6cfc9397-7268-4bd1-8bbf-d107e94ab35a/openstack-network-exporter/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.137708 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6cfc9397-7268-4bd1-8bbf-d107e94ab35a/ovn-northd/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.214590 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9jw45_6c737290-0616-475b-a839-cca387d8d90d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.258501 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b9d4ac2-2b66-441a-a6d4-0d467d857f99/openstack-network-exporter/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.326850 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b9d4ac2-2b66-441a-a6d4-0d467d857f99/ovsdbserver-nb/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.494943 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c07f346e-3e6c-41a5-bdda-67a4a5f04ba7/openstack-network-exporter/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.497213 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c07f346e-3e6c-41a5-bdda-67a4a5f04ba7/ovsdbserver-sb/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 
18:38:44.819854 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-fc784f9bb-wjct6_48841a5b-142c-49d0-8e87-8562f8d1f824/placement-api/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.860453 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/setup-container/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.912277 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-fc784f9bb-wjct6_48841a5b-142c-49d0-8e87-8562f8d1f824/placement-log/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.069863 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/setup-container/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.146765 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/rabbitmq/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.155465 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/setup-container/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.403743 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/setup-container/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.428544 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/rabbitmq/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.488883 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88_1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:45 crc 
kubenswrapper[4795]: I0320 18:38:45.655870 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-tx6d9_d7dc5d37-6d24-48ea-acc1-2b4ed3de6936/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.704956 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk_e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.981900 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-prfq6_9cdb4943-60a1-41cc-aead-1702a4c1f68a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.004098 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j6rls_80cf5a83-936d-4789-a7bc-b91cdb0e564d/ssh-known-hosts-edpm-deployment/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.202040 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6697f55ff5-fj55x_e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6/proxy-server/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.291458 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6697f55ff5-fj55x_e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6/proxy-httpd/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.300062 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-m8zw5_2c422574-0103-4c97-9e23-5a78c5b44e69/swift-ring-rebalance/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.495493 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-auditor/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 
18:38:46.559941 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-reaper/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.587023 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-replicator/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.850816 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ca95ec62-fce9-4c91-bb59-fa80f512edba/memcached/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.852315 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-auditor/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.860481 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-server/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.922229 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-replicator/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.953071 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-server/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.029359 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-updater/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.077944 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-expirer/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.098230 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-auditor/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.141793 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-server/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.163889 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-replicator/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.280316 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/swift-recon-cron/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.281705 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/rsync/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.291809 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-updater/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.574628 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_caaf60a5-8c45-4831-8d26-8cf808f1da7a/tempest-tests-tempest-tests-runner/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.652933 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0/test-operator-logs-container/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.825872 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5_35b4aa82-d668-474b-b54d-b540190f5a6c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:48 crc kubenswrapper[4795]: 
I0320 18:38:48.212998 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh_d519d04c-89f1-46b7-8136-1a9596af73ac/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:39:11 crc kubenswrapper[4795]: I0320 18:39:11.300471 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:39:11 crc kubenswrapper[4795]: I0320 18:39:11.301867 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.532106 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.700329 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.709123 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.744019 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.906985 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.914609 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.929123 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/extract/0.log" Mar 20 18:39:15 crc kubenswrapper[4795]: I0320 18:39:15.192596 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-5hzvs_afefdb79-bad6-4deb-904b-515174cca414/manager/0.log" Mar 20 18:39:15 crc kubenswrapper[4795]: I0320 18:39:15.312671 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-jgs27_43804d6b-2358-46fd-bf04-26b2308f8ab0/manager/0.log" Mar 20 18:39:15 crc kubenswrapper[4795]: I0320 18:39:15.506655 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-dwx6n_a957ef3d-357c-4aa4-865c-533f889257d7/manager/0.log" Mar 20 18:39:15 crc kubenswrapper[4795]: I0320 18:39:15.610584 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-rmcrf_4cdd16c5-b7d3-4c52-a286-f3555daf43d9/manager/0.log" Mar 20 18:39:15 crc kubenswrapper[4795]: I0320 
18:39:15.769360 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-f74p9_ded84ba8-d70a-4379-bc80-d142e5306cc7/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.082765 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-55vp5_9cba9cd3-4144-4262-82a2-f2330793aae6/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.195511 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6f8b7f6fdf-lrjfh_fc0f2e63-50dd-424e-af01-3d09c9edd5b3/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.379264 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-6hsxn_84901a7b-ddbf-47d9-954f-c167cd9cd46c/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.426560 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-trjt4_7a887d91-fa86-45d2-a6be-aa7326f7d544/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.732258 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-jfdzb_071f0af8-4164-4f95-b0ee-720e3b3097f3/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.894440 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-h9f9t_21481bba-04ec-47ce-95d0-fe27787a3d62/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.901597 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-bqzcz_0ffe016b-8919-4b8f-839c-669637b7accc/manager/0.log" Mar 20 18:39:16 crc 
kubenswrapper[4795]: I0320 18:39:16.995297 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-5v5sg_0da03e08-561c-4b5f-89c7-af80c8f39f54/manager/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.055367 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-n7cl7_d4ff6977-1303-4267-983e-3e99935f2aae/manager/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.158647 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f557zsq_a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2/manager/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.310121 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65b67cc5c9-vm29j_084071f5-e58b-451b-9cf5-67203ae1ba02/operator/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.518722 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-b6ckg_3aeffd27-d2c7-4744-8e01-07a4db74597e/registry-server/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.721392 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-dtfmz_84a19583-b173-4fb9-8b83-d9c41a5faf79/manager/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.791358 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-6cw7v_b47e6216-2e29-4d58-8b0c-5970aee6307b/manager/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.963597 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-828jr_750d9405-0514-4876-821e-9ab1f6871e87/manager/0.log" Mar 20 18:39:18 
crc kubenswrapper[4795]: I0320 18:39:18.222875 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-jbwss_46248665-6f9f-46e0-8db7-6be8c47cf521/manager/0.log" Mar 20 18:39:18 crc kubenswrapper[4795]: I0320 18:39:18.282038 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-rv5df_e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38/manager/0.log" Mar 20 18:39:18 crc kubenswrapper[4795]: I0320 18:39:18.483560 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-56f44579c8-px2ft_0d8b26db-957e-4c0e-bb22-42f12d5beb0b/manager/0.log" Mar 20 18:39:18 crc kubenswrapper[4795]: I0320 18:39:18.498742 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-6z7j5_933bcfd5-f2d1-404f-876d-1d3da597f415/manager/0.log" Mar 20 18:39:40 crc kubenswrapper[4795]: I0320 18:39:40.096169 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-knn77_cd9b8a97-1b9d-4365-a985-a02d4078e3c2/control-plane-machine-set-operator/0.log" Mar 20 18:39:40 crc kubenswrapper[4795]: I0320 18:39:40.283482 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-p5hmr_9f31b9ac-9447-4b20-ac60-7532edfa4600/machine-api-operator/0.log" Mar 20 18:39:40 crc kubenswrapper[4795]: I0320 18:39:40.294239 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-p5hmr_9f31b9ac-9447-4b20-ac60-7532edfa4600/kube-rbac-proxy/0.log" Mar 20 18:39:41 crc kubenswrapper[4795]: I0320 18:39:41.300639 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:39:41 crc kubenswrapper[4795]: I0320 18:39:41.300740 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:39:54 crc kubenswrapper[4795]: I0320 18:39:54.842856 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-lqmsr_5231a25a-8bda-4f72-8a81-e5a49cdc31eb/cert-manager-controller/0.log" Mar 20 18:39:54 crc kubenswrapper[4795]: I0320 18:39:54.942421 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-smr2n_7df834a3-0298-4cc9-8b4e-49ce3f51183e/cert-manager-cainjector/0.log" Mar 20 18:39:55 crc kubenswrapper[4795]: I0320 18:39:55.035754 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cff8c_88832f68-9f72-4321-8d3f-bb3e23465fdb/cert-manager-webhook/0.log" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.140381 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567200-sb6x8"] Mar 20 18:40:00 crc kubenswrapper[4795]: E0320 18:40:00.141363 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73777af0-dee3-47d4-a9d2-a48649e84e4d" containerName="oc" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.141376 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73777af0-dee3-47d4-a9d2-a48649e84e4d" containerName="oc" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.141546 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="73777af0-dee3-47d4-a9d2-a48649e84e4d" containerName="oc" Mar 20 18:40:00 
crc kubenswrapper[4795]: I0320 18:40:00.142157 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.144197 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.144321 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.145705 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.148650 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-sb6x8"] Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.245390 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.247223 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.262785 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.265271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4x96\" (UniqueName: \"kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96\") pod \"auto-csr-approver-29567200-sb6x8\" (UID: \"37b0a7d2-ac09-4b84-8083-48c33d97b032\") " pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.367417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwhz\" (UniqueName: \"kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.367509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.367627 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.367647 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4x96\" (UniqueName: \"kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96\") pod \"auto-csr-approver-29567200-sb6x8\" (UID: \"37b0a7d2-ac09-4b84-8083-48c33d97b032\") " pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.391171 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4x96\" (UniqueName: \"kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96\") pod \"auto-csr-approver-29567200-sb6x8\" (UID: \"37b0a7d2-ac09-4b84-8083-48c33d97b032\") " pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.459671 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.470084 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwhz\" (UniqueName: \"kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.470189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.470298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.471016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.471351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.494009 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbwhz\" (UniqueName: \"kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.566362 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: W0320 18:40:00.923412 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b0a7d2_ac09_4b84_8083_48c33d97b032.slice/crio-e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30 WatchSource:0}: Error finding container e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30: Status 404 returned error can't find the container with id e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30 Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.924325 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-sb6x8"] Mar 20 18:40:01 crc kubenswrapper[4795]: W0320 18:40:01.121097 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c84a44_9022_49e8_bc90_cf827381767d.slice/crio-be5711102151c4f8ff340c5a139a8ad16496aa629f2910ce8d0f45e90941af46 WatchSource:0}: Error finding container be5711102151c4f8ff340c5a139a8ad16496aa629f2910ce8d0f45e90941af46: Status 404 returned error can't find the container with id be5711102151c4f8ff340c5a139a8ad16496aa629f2910ce8d0f45e90941af46 Mar 20 18:40:01 crc kubenswrapper[4795]: I0320 18:40:01.127916 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:01 crc kubenswrapper[4795]: I0320 18:40:01.160717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" event={"ID":"37b0a7d2-ac09-4b84-8083-48c33d97b032","Type":"ContainerStarted","Data":"e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30"} Mar 20 18:40:01 crc kubenswrapper[4795]: I0320 18:40:01.161560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" 
event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerStarted","Data":"be5711102151c4f8ff340c5a139a8ad16496aa629f2910ce8d0f45e90941af46"} Mar 20 18:40:02 crc kubenswrapper[4795]: I0320 18:40:02.170956 4795 generic.go:334] "Generic (PLEG): container finished" podID="91c84a44-9022-49e8-bc90-cf827381767d" containerID="dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9" exitCode=0 Mar 20 18:40:02 crc kubenswrapper[4795]: I0320 18:40:02.171047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerDied","Data":"dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9"} Mar 20 18:40:03 crc kubenswrapper[4795]: I0320 18:40:03.182670 4795 generic.go:334] "Generic (PLEG): container finished" podID="37b0a7d2-ac09-4b84-8083-48c33d97b032" containerID="f6bd6f59f3702ed1b553664f1b56ecda875d8c79fa2d467d34f79e36c2a97634" exitCode=0 Mar 20 18:40:03 crc kubenswrapper[4795]: I0320 18:40:03.182742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" event={"ID":"37b0a7d2-ac09-4b84-8083-48c33d97b032","Type":"ContainerDied","Data":"f6bd6f59f3702ed1b553664f1b56ecda875d8c79fa2d467d34f79e36c2a97634"} Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.195179 4795 generic.go:334] "Generic (PLEG): container finished" podID="91c84a44-9022-49e8-bc90-cf827381767d" containerID="0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f" exitCode=0 Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.195282 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerDied","Data":"0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f"} Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.545325 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.679498 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4x96\" (UniqueName: \"kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96\") pod \"37b0a7d2-ac09-4b84-8083-48c33d97b032\" (UID: \"37b0a7d2-ac09-4b84-8083-48c33d97b032\") " Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.684780 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96" (OuterVolumeSpecName: "kube-api-access-z4x96") pod "37b0a7d2-ac09-4b84-8083-48c33d97b032" (UID: "37b0a7d2-ac09-4b84-8083-48c33d97b032"). InnerVolumeSpecName "kube-api-access-z4x96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.781925 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4x96\" (UniqueName: \"kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.209752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" event={"ID":"37b0a7d2-ac09-4b84-8083-48c33d97b032","Type":"ContainerDied","Data":"e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30"} Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.209814 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30" Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.209889 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.218529 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerStarted","Data":"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320"} Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.240376 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hk5bn" podStartSLOduration=2.790136796 podStartE2EDuration="5.240357253s" podCreationTimestamp="2026-03-20 18:40:00 +0000 UTC" firstStartedPulling="2026-03-20 18:40:02.173233466 +0000 UTC m=+4945.631265007" lastFinishedPulling="2026-03-20 18:40:04.623453903 +0000 UTC m=+4948.081485464" observedRunningTime="2026-03-20 18:40:05.240312351 +0000 UTC m=+4948.698343922" watchObservedRunningTime="2026-03-20 18:40:05.240357253 +0000 UTC m=+4948.698388794" Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.616053 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-t7ml6"] Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.625275 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-t7ml6"] Mar 20 18:40:07 crc kubenswrapper[4795]: I0320 18:40:07.266209 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce7186d-a505-4b16-ae93-2d95886d5f2d" path="/var/lib/kubelet/pods/7ce7186d-a505-4b16-ae93-2d95886d5f2d/volumes" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.567164 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.567606 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.618143 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.636710 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-5jfjl_d34761db-41bf-4e5f-bdca-8c25e281c924/nmstate-console-plugin/0.log" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.834831 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bsp49_e070281f-65f5-4c6d-b012-06c027393646/nmstate-handler/0.log" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.889369 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-xjj2s_65c42497-77ba-49bc-a292-5003a353fde6/kube-rbac-proxy/0.log" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.978065 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-xjj2s_65c42497-77ba-49bc-a292-5003a353fde6/nmstate-metrics/0.log" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.069532 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-dlcps_efca4120-31ef-4c52-a6da-59b33144a979/nmstate-operator/0.log" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.163923 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-mjhsq_f50011ef-d180-4d84-ba10-a2da522a579d/nmstate-webhook/0.log" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.300501 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.300550 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.300589 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.301301 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.301351 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e" gracePeriod=600 Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.309083 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.365149 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:12 crc kubenswrapper[4795]: I0320 18:40:12.274676 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" 
containerID="55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e" exitCode=0 Mar 20 18:40:12 crc kubenswrapper[4795]: I0320 18:40:12.274779 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e"} Mar 20 18:40:12 crc kubenswrapper[4795]: I0320 18:40:12.275784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"} Mar 20 18:40:12 crc kubenswrapper[4795]: I0320 18:40:12.275804 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:40:13 crc kubenswrapper[4795]: I0320 18:40:13.032005 4795 scope.go:117] "RemoveContainer" containerID="640523aab60a0eb7070cfd04917b68ca823bf4802ffb825df958dcc7af70501e" Mar 20 18:40:13 crc kubenswrapper[4795]: I0320 18:40:13.287137 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hk5bn" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="registry-server" containerID="cri-o://48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320" gracePeriod=2 Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.290793 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.297384 4795 generic.go:334] "Generic (PLEG): container finished" podID="91c84a44-9022-49e8-bc90-cf827381767d" containerID="48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320" exitCode=0 Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.297421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerDied","Data":"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320"} Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.297444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerDied","Data":"be5711102151c4f8ff340c5a139a8ad16496aa629f2910ce8d0f45e90941af46"} Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.297460 4795 scope.go:117] "RemoveContainer" containerID="48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.297551 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.328625 4795 scope.go:117] "RemoveContainer" containerID="0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.359067 4795 scope.go:117] "RemoveContainer" containerID="dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.359454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbwhz\" (UniqueName: \"kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz\") pod \"91c84a44-9022-49e8-bc90-cf827381767d\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.359508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content\") pod \"91c84a44-9022-49e8-bc90-cf827381767d\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.359565 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities\") pod \"91c84a44-9022-49e8-bc90-cf827381767d\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.360594 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities" (OuterVolumeSpecName: "utilities") pod "91c84a44-9022-49e8-bc90-cf827381767d" (UID: "91c84a44-9022-49e8-bc90-cf827381767d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.369159 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz" (OuterVolumeSpecName: "kube-api-access-qbwhz") pod "91c84a44-9022-49e8-bc90-cf827381767d" (UID: "91c84a44-9022-49e8-bc90-cf827381767d"). InnerVolumeSpecName "kube-api-access-qbwhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.383238 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91c84a44-9022-49e8-bc90-cf827381767d" (UID: "91c84a44-9022-49e8-bc90-cf827381767d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.443906 4795 scope.go:117] "RemoveContainer" containerID="48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320" Mar 20 18:40:14 crc kubenswrapper[4795]: E0320 18:40:14.444381 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320\": container with ID starting with 48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320 not found: ID does not exist" containerID="48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.444408 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320"} err="failed to get container status \"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320\": rpc error: code = NotFound desc = could not find 
container \"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320\": container with ID starting with 48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320 not found: ID does not exist" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.444454 4795 scope.go:117] "RemoveContainer" containerID="0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f" Mar 20 18:40:14 crc kubenswrapper[4795]: E0320 18:40:14.444705 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f\": container with ID starting with 0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f not found: ID does not exist" containerID="0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.444725 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f"} err="failed to get container status \"0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f\": rpc error: code = NotFound desc = could not find container \"0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f\": container with ID starting with 0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f not found: ID does not exist" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.444737 4795 scope.go:117] "RemoveContainer" containerID="dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9" Mar 20 18:40:14 crc kubenswrapper[4795]: E0320 18:40:14.444939 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9\": container with ID starting with dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9 not found: ID does 
not exist" containerID="dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.444958 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9"} err="failed to get container status \"dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9\": rpc error: code = NotFound desc = could not find container \"dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9\": container with ID starting with dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9 not found: ID does not exist" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.461275 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbwhz\" (UniqueName: \"kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.461307 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.461319 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.629006 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.637442 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:15 crc kubenswrapper[4795]: I0320 18:40:15.270339 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="91c84a44-9022-49e8-bc90-cf827381767d" path="/var/lib/kubelet/pods/91c84a44-9022-49e8-bc90-cf827381767d/volumes" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.628926 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:20 crc kubenswrapper[4795]: E0320 18:40:20.630170 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="extract-utilities" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630194 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="extract-utilities" Mar 20 18:40:20 crc kubenswrapper[4795]: E0320 18:40:20.630231 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="registry-server" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630243 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="registry-server" Mar 20 18:40:20 crc kubenswrapper[4795]: E0320 18:40:20.630271 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b0a7d2-ac09-4b84-8083-48c33d97b032" containerName="oc" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630292 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b0a7d2-ac09-4b84-8083-48c33d97b032" containerName="oc" Mar 20 18:40:20 crc kubenswrapper[4795]: E0320 18:40:20.630342 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="extract-content" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630353 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="extract-content" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630622 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="registry-server" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630668 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b0a7d2-ac09-4b84-8083-48c33d97b032" containerName="oc" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.632867 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.679556 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.774881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.774931 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.775636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qs9\" (UniqueName: \"kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.877357 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-85qs9\" (UniqueName: \"kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.877541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.877571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.878136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.878154 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.905458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-85qs9\" (UniqueName: \"kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.979604 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:21 crc kubenswrapper[4795]: I0320 18:40:21.479132 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:22 crc kubenswrapper[4795]: I0320 18:40:22.411035 4795 generic.go:334] "Generic (PLEG): container finished" podID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerID="d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5" exitCode=0 Mar 20 18:40:22 crc kubenswrapper[4795]: I0320 18:40:22.411124 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerDied","Data":"d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5"} Mar 20 18:40:22 crc kubenswrapper[4795]: I0320 18:40:22.411477 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerStarted","Data":"11bb4e2bc49835e6a2d6bce0bd30624c0604522ef2556b61ca172297f3444edc"} Mar 20 18:40:22 crc kubenswrapper[4795]: I0320 18:40:22.414169 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:40:23 crc kubenswrapper[4795]: I0320 18:40:23.438894 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" 
event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerStarted","Data":"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2"} Mar 20 18:40:24 crc kubenswrapper[4795]: I0320 18:40:24.452425 4795 generic.go:334] "Generic (PLEG): container finished" podID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerID="4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2" exitCode=0 Mar 20 18:40:24 crc kubenswrapper[4795]: I0320 18:40:24.452493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerDied","Data":"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2"} Mar 20 18:40:25 crc kubenswrapper[4795]: I0320 18:40:25.474912 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerStarted","Data":"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974"} Mar 20 18:40:25 crc kubenswrapper[4795]: I0320 18:40:25.500554 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5hcgx" podStartSLOduration=3.045631175 podStartE2EDuration="5.500535257s" podCreationTimestamp="2026-03-20 18:40:20 +0000 UTC" firstStartedPulling="2026-03-20 18:40:22.413942247 +0000 UTC m=+4965.871973788" lastFinishedPulling="2026-03-20 18:40:24.868846329 +0000 UTC m=+4968.326877870" observedRunningTime="2026-03-20 18:40:25.498096542 +0000 UTC m=+4968.956128083" watchObservedRunningTime="2026-03-20 18:40:25.500535257 +0000 UTC m=+4968.958566798" Mar 20 18:40:30 crc kubenswrapper[4795]: I0320 18:40:30.980818 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:30 crc kubenswrapper[4795]: I0320 18:40:30.981477 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:31 crc kubenswrapper[4795]: I0320 18:40:31.725940 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:31 crc kubenswrapper[4795]: I0320 18:40:31.775845 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:31 crc kubenswrapper[4795]: I0320 18:40:31.970653 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:33 crc kubenswrapper[4795]: I0320 18:40:33.555680 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5hcgx" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="registry-server" containerID="cri-o://3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974" gracePeriod=2 Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.099140 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.238274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities\") pod \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.238581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85qs9\" (UniqueName: \"kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9\") pod \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.238698 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content\") pod \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.241620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities" (OuterVolumeSpecName: "utilities") pod "111e8972-3b13-46b6-b3ff-fdcfb3edd832" (UID: "111e8972-3b13-46b6-b3ff-fdcfb3edd832"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.247259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9" (OuterVolumeSpecName: "kube-api-access-85qs9") pod "111e8972-3b13-46b6-b3ff-fdcfb3edd832" (UID: "111e8972-3b13-46b6-b3ff-fdcfb3edd832"). InnerVolumeSpecName "kube-api-access-85qs9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.296221 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "111e8972-3b13-46b6-b3ff-fdcfb3edd832" (UID: "111e8972-3b13-46b6-b3ff-fdcfb3edd832"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.341155 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85qs9\" (UniqueName: \"kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.341187 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.341195 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.574974 4795 generic.go:334] "Generic (PLEG): container finished" podID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerID="3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974" exitCode=0 Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.575038 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerDied","Data":"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974"} Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.575072 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerDied","Data":"11bb4e2bc49835e6a2d6bce0bd30624c0604522ef2556b61ca172297f3444edc"} Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.575096 4795 scope.go:117] "RemoveContainer" containerID="3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.575317 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.605085 4795 scope.go:117] "RemoveContainer" containerID="4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.620073 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.625797 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.630888 4795 scope.go:117] "RemoveContainer" containerID="d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.681803 4795 scope.go:117] "RemoveContainer" containerID="3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974" Mar 20 18:40:34 crc kubenswrapper[4795]: E0320 18:40:34.682202 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974\": container with ID starting with 3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974 not found: ID does not exist" containerID="3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 
18:40:34.682235 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974"} err="failed to get container status \"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974\": rpc error: code = NotFound desc = could not find container \"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974\": container with ID starting with 3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974 not found: ID does not exist" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.682255 4795 scope.go:117] "RemoveContainer" containerID="4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2" Mar 20 18:40:34 crc kubenswrapper[4795]: E0320 18:40:34.682578 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2\": container with ID starting with 4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2 not found: ID does not exist" containerID="4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.682600 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2"} err="failed to get container status \"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2\": rpc error: code = NotFound desc = could not find container \"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2\": container with ID starting with 4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2 not found: ID does not exist" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.682613 4795 scope.go:117] "RemoveContainer" containerID="d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5" Mar 20 18:40:34 crc 
kubenswrapper[4795]: E0320 18:40:34.682972 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5\": container with ID starting with d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5 not found: ID does not exist" containerID="d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.682996 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5"} err="failed to get container status \"d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5\": rpc error: code = NotFound desc = could not find container \"d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5\": container with ID starting with d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5 not found: ID does not exist" Mar 20 18:40:35 crc kubenswrapper[4795]: I0320 18:40:35.271796 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" path="/var/lib/kubelet/pods/111e8972-3b13-46b6-b3ff-fdcfb3edd832/volumes" Mar 20 18:40:41 crc kubenswrapper[4795]: I0320 18:40:41.387446 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kvtc5_2ce06e1f-5454-4b85-888b-3230c0086c2e/kube-rbac-proxy/0.log" Mar 20 18:40:41 crc kubenswrapper[4795]: I0320 18:40:41.562641 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kvtc5_2ce06e1f-5454-4b85-888b-3230c0086c2e/controller/0.log" Mar 20 18:40:41 crc kubenswrapper[4795]: I0320 18:40:41.939159 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: 
I0320 18:40:42.121647 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.121865 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.129690 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.153161 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.311954 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.368951 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.378547 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.403988 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.463974 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.538758 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.566264 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.601560 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/controller/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.721164 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/frr-metrics/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.752981 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/kube-rbac-proxy/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.804674 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/kube-rbac-proxy-frr/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.913187 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/reloader/0.log" Mar 20 18:40:43 crc kubenswrapper[4795]: I0320 18:40:43.075439 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jw5dk_377dbbb7-0571-40cd-9fe3-3c86fbf4f092/frr-k8s-webhook-server/0.log" Mar 20 18:40:43 crc kubenswrapper[4795]: I0320 18:40:43.162048 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7547f4d8c8-499mj_0e8dba8d-8387-4ced-ac54-b8d5e1cf3650/manager/0.log" Mar 20 18:40:43 crc kubenswrapper[4795]: I0320 18:40:43.906123 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5ffc48dc7-t9vwn_2d29ac93-da31-4834-a858-d5bd9adb28d1/webhook-server/0.log" Mar 20 18:40:44 crc kubenswrapper[4795]: I0320 18:40:44.030811 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bl9qp_8834c8fc-36f7-41da-867f-ec5a32e25b36/kube-rbac-proxy/0.log" Mar 20 18:40:44 crc kubenswrapper[4795]: I0320 18:40:44.407310 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/frr/0.log" Mar 20 18:40:44 crc kubenswrapper[4795]: I0320 18:40:44.483157 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bl9qp_8834c8fc-36f7-41da-867f-ec5a32e25b36/speaker/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.152076 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.381310 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.386940 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.422429 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.593994 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.601478 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.607243 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/extract/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.765027 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.941220 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.943590 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.972299 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.103108 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 
18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.124191 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/extract/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.125405 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.268095 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.431287 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.467285 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.485933 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.648818 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.672365 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.910524 
4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.108829 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.126068 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.128739 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.196147 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/registry-server/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.280524 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.307740 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.505410 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8x76m_a2de2777-57e1-4310-a878-1cfc1fc77e44/marketplace-operator/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.615880 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.814007 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/registry-server/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.841982 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.851742 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.851750 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.045530 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.125872 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.216217 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/registry-server/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.262841 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.453741 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.514264 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.525209 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.664115 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.699722 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:41:02 crc kubenswrapper[4795]: I0320 18:41:02.221141 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/registry-server/0.log" Mar 20 18:41:22 crc kubenswrapper[4795]: E0320 18:41:22.562734 4795 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.58:37962->38.102.83.58:45419: write tcp 38.102.83.58:37962->38.102.83.58:45419: write: broken pipe Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.160729 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567202-mx5d7"] Mar 20 18:42:00 crc kubenswrapper[4795]: E0320 
18:42:00.161556 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="extract-utilities" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.161569 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="extract-utilities" Mar 20 18:42:00 crc kubenswrapper[4795]: E0320 18:42:00.161589 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="registry-server" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.161595 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="registry-server" Mar 20 18:42:00 crc kubenswrapper[4795]: E0320 18:42:00.161607 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="extract-content" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.161613 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="extract-content" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.161792 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="registry-server" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.162728 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.165651 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.165942 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.165993 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.173790 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567202-mx5d7"] Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.355742 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rsxq\" (UniqueName: \"kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq\") pod \"auto-csr-approver-29567202-mx5d7\" (UID: \"08e41958-5524-4e61-8976-654c68baf648\") " pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.458708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rsxq\" (UniqueName: \"kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq\") pod \"auto-csr-approver-29567202-mx5d7\" (UID: \"08e41958-5524-4e61-8976-654c68baf648\") " pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.488356 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rsxq\" (UniqueName: \"kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq\") pod \"auto-csr-approver-29567202-mx5d7\" (UID: \"08e41958-5524-4e61-8976-654c68baf648\") " 
pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.784964 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:01 crc kubenswrapper[4795]: I0320 18:42:01.350122 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567202-mx5d7"] Mar 20 18:42:01 crc kubenswrapper[4795]: I0320 18:42:01.430287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" event={"ID":"08e41958-5524-4e61-8976-654c68baf648","Type":"ContainerStarted","Data":"91d76a03da65f2314c73d85ea90c8b152c1b273fd40f50eb606305821f6612de"} Mar 20 18:42:03 crc kubenswrapper[4795]: I0320 18:42:03.450459 4795 generic.go:334] "Generic (PLEG): container finished" podID="08e41958-5524-4e61-8976-654c68baf648" containerID="a60cfe51f17f419369391cdefbf7858f781a1f44daab783e763ce1f4f4ed5586" exitCode=0 Mar 20 18:42:03 crc kubenswrapper[4795]: I0320 18:42:03.450616 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" event={"ID":"08e41958-5524-4e61-8976-654c68baf648","Type":"ContainerDied","Data":"a60cfe51f17f419369391cdefbf7858f781a1f44daab783e763ce1f4f4ed5586"} Mar 20 18:42:04 crc kubenswrapper[4795]: I0320 18:42:04.824159 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:04 crc kubenswrapper[4795]: I0320 18:42:04.955510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rsxq\" (UniqueName: \"kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq\") pod \"08e41958-5524-4e61-8976-654c68baf648\" (UID: \"08e41958-5524-4e61-8976-654c68baf648\") " Mar 20 18:42:04 crc kubenswrapper[4795]: I0320 18:42:04.971739 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq" (OuterVolumeSpecName: "kube-api-access-2rsxq") pod "08e41958-5524-4e61-8976-654c68baf648" (UID: "08e41958-5524-4e61-8976-654c68baf648"). InnerVolumeSpecName "kube-api-access-2rsxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.058558 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rsxq\" (UniqueName: \"kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq\") on node \"crc\" DevicePath \"\"" Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.478661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" event={"ID":"08e41958-5524-4e61-8976-654c68baf648","Type":"ContainerDied","Data":"91d76a03da65f2314c73d85ea90c8b152c1b273fd40f50eb606305821f6612de"} Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.478733 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91d76a03da65f2314c73d85ea90c8b152c1b273fd40f50eb606305821f6612de" Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.478796 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.912937 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-t9x4q"] Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.923630 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-t9x4q"] Mar 20 18:42:07 crc kubenswrapper[4795]: I0320 18:42:07.273259 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7203100a-018c-4662-a760-a16bd5c6322d" path="/var/lib/kubelet/pods/7203100a-018c-4662-a760-a16bd5c6322d/volumes" Mar 20 18:42:11 crc kubenswrapper[4795]: I0320 18:42:11.300553 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:42:11 crc kubenswrapper[4795]: I0320 18:42:11.301111 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:42:13 crc kubenswrapper[4795]: I0320 18:42:13.181879 4795 scope.go:117] "RemoveContainer" containerID="dd786a69be248d53a6715fb536c79a06b01be09807a6bb21bbca9e7786db827c" Mar 20 18:42:41 crc kubenswrapper[4795]: I0320 18:42:41.300142 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:42:41 crc kubenswrapper[4795]: 
I0320 18:42:41.300803 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:43:11 crc kubenswrapper[4795]: I0320 18:43:11.300582 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:43:11 crc kubenswrapper[4795]: I0320 18:43:11.301427 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:43:11 crc kubenswrapper[4795]: I0320 18:43:11.301508 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:43:11 crc kubenswrapper[4795]: I0320 18:43:11.302811 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:43:11 crc kubenswrapper[4795]: I0320 18:43:11.302925 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" 
containerName="machine-config-daemon" containerID="cri-o://4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" gracePeriod=600 Mar 20 18:43:11 crc kubenswrapper[4795]: E0320 18:43:11.434725 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:43:12 crc kubenswrapper[4795]: I0320 18:43:12.285665 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" exitCode=0 Mar 20 18:43:12 crc kubenswrapper[4795]: I0320 18:43:12.286043 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"} Mar 20 18:43:12 crc kubenswrapper[4795]: I0320 18:43:12.286086 4795 scope.go:117] "RemoveContainer" containerID="55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e" Mar 20 18:43:12 crc kubenswrapper[4795]: I0320 18:43:12.287011 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:43:12 crc kubenswrapper[4795]: E0320 18:43:12.287368 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:43:24 crc kubenswrapper[4795]: I0320 18:43:24.252460 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:43:24 crc kubenswrapper[4795]: E0320 18:43:24.253384 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:43:25 crc kubenswrapper[4795]: I0320 18:43:25.464621 4795 generic.go:334] "Generic (PLEG): container finished" podID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerID="f7dc1c0fd67006d7343e6ade29b0720cb88d319027089039c39df72434041f5f" exitCode=0 Mar 20 18:43:25 crc kubenswrapper[4795]: I0320 18:43:25.464759 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" event={"ID":"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e","Type":"ContainerDied","Data":"f7dc1c0fd67006d7343e6ade29b0720cb88d319027089039c39df72434041f5f"} Mar 20 18:43:25 crc kubenswrapper[4795]: I0320 18:43:25.465620 4795 scope.go:117] "RemoveContainer" containerID="f7dc1c0fd67006d7343e6ade29b0720cb88d319027089039c39df72434041f5f" Mar 20 18:43:25 crc kubenswrapper[4795]: I0320 18:43:25.996183 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4zsvz_must-gather-k5nt7_6e72ecdf-d9fb-433e-b04d-4807b5e60c2e/gather/0.log" Mar 20 18:43:39 crc kubenswrapper[4795]: I0320 18:43:39.253594 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:43:39 crc kubenswrapper[4795]: E0320 
18:43:39.254328 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:43:39 crc kubenswrapper[4795]: I0320 18:43:39.929368 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4zsvz/must-gather-k5nt7"] Mar 20 18:43:39 crc kubenswrapper[4795]: I0320 18:43:39.929984 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="copy" containerID="cri-o://7e428e78fa28b099376d48c4437d22fc8fb058f496bc7d67f2ecb14cd1bd3b22" gracePeriod=2 Mar 20 18:43:39 crc kubenswrapper[4795]: I0320 18:43:39.940254 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4zsvz/must-gather-k5nt7"] Mar 20 18:43:40 crc kubenswrapper[4795]: I0320 18:43:40.636345 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4zsvz_must-gather-k5nt7_6e72ecdf-d9fb-433e-b04d-4807b5e60c2e/copy/0.log" Mar 20 18:43:40 crc kubenswrapper[4795]: I0320 18:43:40.637155 4795 generic.go:334] "Generic (PLEG): container finished" podID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerID="7e428e78fa28b099376d48c4437d22fc8fb058f496bc7d67f2ecb14cd1bd3b22" exitCode=143 Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.014258 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4zsvz_must-gather-k5nt7_6e72ecdf-d9fb-433e-b04d-4807b5e60c2e/copy/0.log" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.014909 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.180742 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wwcp\" (UniqueName: \"kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp\") pod \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.180884 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output\") pod \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.189410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp" (OuterVolumeSpecName: "kube-api-access-5wwcp") pod "6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" (UID: "6e72ecdf-d9fb-433e-b04d-4807b5e60c2e"). InnerVolumeSpecName "kube-api-access-5wwcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.283196 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wwcp\" (UniqueName: \"kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp\") on node \"crc\" DevicePath \"\"" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.451578 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" (UID: "6e72ecdf-d9fb-433e-b04d-4807b5e60c2e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.488406 4795 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.648753 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4zsvz_must-gather-k5nt7_6e72ecdf-d9fb-433e-b04d-4807b5e60c2e/copy/0.log" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.649278 4795 scope.go:117] "RemoveContainer" containerID="7e428e78fa28b099376d48c4437d22fc8fb058f496bc7d67f2ecb14cd1bd3b22" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.649348 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.674629 4795 scope.go:117] "RemoveContainer" containerID="f7dc1c0fd67006d7343e6ade29b0720cb88d319027089039c39df72434041f5f" Mar 20 18:43:43 crc kubenswrapper[4795]: I0320 18:43:43.270549 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" path="/var/lib/kubelet/pods/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e/volumes" Mar 20 18:43:54 crc kubenswrapper[4795]: I0320 18:43:54.252887 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:43:54 crc kubenswrapper[4795]: E0320 18:43:54.253773 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.165502 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4vtjv"] Mar 20 18:44:00 crc kubenswrapper[4795]: E0320 18:44:00.166317 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e41958-5524-4e61-8976-654c68baf648" containerName="oc" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166329 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e41958-5524-4e61-8976-654c68baf648" containerName="oc" Mar 20 18:44:00 crc kubenswrapper[4795]: E0320 18:44:00.166343 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="copy" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166348 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="copy" Mar 20 18:44:00 crc kubenswrapper[4795]: E0320 18:44:00.166359 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="gather" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166365 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="gather" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166540 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="gather" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166553 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="copy" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166566 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e41958-5524-4e61-8976-654c68baf648" containerName="oc" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.167164 4795 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.173261 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.173439 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.173467 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.183771 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4vtjv"] Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.264741 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9s8w\" (UniqueName: \"kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w\") pod \"auto-csr-approver-29567204-4vtjv\" (UID: \"88a0b3d5-0037-474d-8d0f-d79e18c3acd0\") " pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.366503 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9s8w\" (UniqueName: \"kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w\") pod \"auto-csr-approver-29567204-4vtjv\" (UID: \"88a0b3d5-0037-474d-8d0f-d79e18c3acd0\") " pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.399924 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9s8w\" (UniqueName: \"kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w\") pod \"auto-csr-approver-29567204-4vtjv\" (UID: 
\"88a0b3d5-0037-474d-8d0f-d79e18c3acd0\") " pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.496204 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.930112 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4vtjv"] Mar 20 18:44:01 crc kubenswrapper[4795]: I0320 18:44:01.892488 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" event={"ID":"88a0b3d5-0037-474d-8d0f-d79e18c3acd0","Type":"ContainerStarted","Data":"ea464badde149bfd60c15f486577ff1bcd0f13bd6337cf69ad9c7fdd6da3f530"} Mar 20 18:44:02 crc kubenswrapper[4795]: I0320 18:44:02.905280 4795 generic.go:334] "Generic (PLEG): container finished" podID="88a0b3d5-0037-474d-8d0f-d79e18c3acd0" containerID="77abde6c25e56b5397403eb84648ee3aa367f56e27e7524871903de3ff0f586b" exitCode=0 Mar 20 18:44:02 crc kubenswrapper[4795]: I0320 18:44:02.905352 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" event={"ID":"88a0b3d5-0037-474d-8d0f-d79e18c3acd0","Type":"ContainerDied","Data":"77abde6c25e56b5397403eb84648ee3aa367f56e27e7524871903de3ff0f586b"} Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.274559 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.350244 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9s8w\" (UniqueName: \"kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w\") pod \"88a0b3d5-0037-474d-8d0f-d79e18c3acd0\" (UID: \"88a0b3d5-0037-474d-8d0f-d79e18c3acd0\") " Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.356052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w" (OuterVolumeSpecName: "kube-api-access-b9s8w") pod "88a0b3d5-0037-474d-8d0f-d79e18c3acd0" (UID: "88a0b3d5-0037-474d-8d0f-d79e18c3acd0"). InnerVolumeSpecName "kube-api-access-b9s8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.454163 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9s8w\" (UniqueName: \"kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w\") on node \"crc\" DevicePath \"\"" Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.939614 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" event={"ID":"88a0b3d5-0037-474d-8d0f-d79e18c3acd0","Type":"ContainerDied","Data":"ea464badde149bfd60c15f486577ff1bcd0f13bd6337cf69ad9c7fdd6da3f530"} Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.939661 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.939662 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea464badde149bfd60c15f486577ff1bcd0f13bd6337cf69ad9c7fdd6da3f530" Mar 20 18:44:05 crc kubenswrapper[4795]: I0320 18:44:05.345455 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-hmxc4"] Mar 20 18:44:05 crc kubenswrapper[4795]: I0320 18:44:05.360505 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-hmxc4"] Mar 20 18:44:07 crc kubenswrapper[4795]: I0320 18:44:07.264623 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:44:07 crc kubenswrapper[4795]: E0320 18:44:07.265342 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:44:07 crc kubenswrapper[4795]: I0320 18:44:07.266247 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73777af0-dee3-47d4-a9d2-a48649e84e4d" path="/var/lib/kubelet/pods/73777af0-dee3-47d4-a9d2-a48649e84e4d/volumes" Mar 20 18:44:13 crc kubenswrapper[4795]: I0320 18:44:13.324324 4795 scope.go:117] "RemoveContainer" containerID="6303fb5093a0bb0022b32e4ef548448b867a34002a11dd9fd46d7dd786dcfd17" Mar 20 18:44:21 crc kubenswrapper[4795]: I0320 18:44:21.252087 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:44:21 crc kubenswrapper[4795]: E0320 18:44:21.253100 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:44:36 crc kubenswrapper[4795]: I0320 18:44:36.253892 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:44:36 crc kubenswrapper[4795]: E0320 18:44:36.254647 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:44:47 crc kubenswrapper[4795]: I0320 18:44:47.265387 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:44:47 crc kubenswrapper[4795]: E0320 18:44:47.266470 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:44:59 crc kubenswrapper[4795]: I0320 18:44:59.252680 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:44:59 crc kubenswrapper[4795]: E0320 
18:44:59.253381 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.176314 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2"] Mar 20 18:45:00 crc kubenswrapper[4795]: E0320 18:45:00.177325 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a0b3d5-0037-474d-8d0f-d79e18c3acd0" containerName="oc" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.177422 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a0b3d5-0037-474d-8d0f-d79e18c3acd0" containerName="oc" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.177805 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a0b3d5-0037-474d-8d0f-d79e18c3acd0" containerName="oc" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.178670 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.183105 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.186232 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.191258 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2"] Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.245498 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.245666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.245746 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7njc\" (UniqueName: \"kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.347443 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.347505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7njc\" (UniqueName: \"kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.347591 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.349713 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.356614 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.368463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7njc\" (UniqueName: \"kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.513985 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:01 crc kubenswrapper[4795]: I0320 18:45:01.007148 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2"] Mar 20 18:45:01 crc kubenswrapper[4795]: I0320 18:45:01.551795 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" event={"ID":"4478c18d-2e03-416d-b738-d06a34d5291e","Type":"ContainerStarted","Data":"381f43dd9727e0d0e774bc9e5c1fcddd9c4d5a38fc19af2de1dd7039022c1b0a"} Mar 20 18:45:01 crc kubenswrapper[4795]: I0320 18:45:01.552086 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" event={"ID":"4478c18d-2e03-416d-b738-d06a34d5291e","Type":"ContainerStarted","Data":"9d0ab81b020e431a087fdff70e65f982f5a02104ccbe929ba497588992f7a468"} Mar 20 18:45:01 crc kubenswrapper[4795]: I0320 18:45:01.573456 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" 
podStartSLOduration=1.5734310919999999 podStartE2EDuration="1.573431092s" podCreationTimestamp="2026-03-20 18:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:45:01.570094448 +0000 UTC m=+5245.028126049" watchObservedRunningTime="2026-03-20 18:45:01.573431092 +0000 UTC m=+5245.031462663" Mar 20 18:45:02 crc kubenswrapper[4795]: I0320 18:45:02.566534 4795 generic.go:334] "Generic (PLEG): container finished" podID="4478c18d-2e03-416d-b738-d06a34d5291e" containerID="381f43dd9727e0d0e774bc9e5c1fcddd9c4d5a38fc19af2de1dd7039022c1b0a" exitCode=0 Mar 20 18:45:02 crc kubenswrapper[4795]: I0320 18:45:02.566597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" event={"ID":"4478c18d-2e03-416d-b738-d06a34d5291e","Type":"ContainerDied","Data":"381f43dd9727e0d0e774bc9e5c1fcddd9c4d5a38fc19af2de1dd7039022c1b0a"} Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.588290 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" event={"ID":"4478c18d-2e03-416d-b738-d06a34d5291e","Type":"ContainerDied","Data":"9d0ab81b020e431a087fdff70e65f982f5a02104ccbe929ba497588992f7a468"} Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.588952 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d0ab81b020e431a087fdff70e65f982f5a02104ccbe929ba497588992f7a468" Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.876664 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.957411 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume\") pod \"4478c18d-2e03-416d-b738-d06a34d5291e\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.957640 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume\") pod \"4478c18d-2e03-416d-b738-d06a34d5291e\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.957727 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7njc\" (UniqueName: \"kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc\") pod \"4478c18d-2e03-416d-b738-d06a34d5291e\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.958992 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume" (OuterVolumeSpecName: "config-volume") pod "4478c18d-2e03-416d-b738-d06a34d5291e" (UID: "4478c18d-2e03-416d-b738-d06a34d5291e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.963393 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4478c18d-2e03-416d-b738-d06a34d5291e" (UID: "4478c18d-2e03-416d-b738-d06a34d5291e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.963828 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc" (OuterVolumeSpecName: "kube-api-access-h7njc") pod "4478c18d-2e03-416d-b738-d06a34d5291e" (UID: "4478c18d-2e03-416d-b738-d06a34d5291e"). InnerVolumeSpecName "kube-api-access-h7njc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.059914 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.059946 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7njc\" (UniqueName: \"kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc\") on node \"crc\" DevicePath \"\""
Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.059958 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.601325 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2"
Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.962563 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff"]
Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.972760 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff"]
Mar 20 18:45:07 crc kubenswrapper[4795]: I0320 18:45:07.276412 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06aca85b-9cb4-47ae-ad12-b1cc429c542d" path="/var/lib/kubelet/pods/06aca85b-9cb4-47ae-ad12-b1cc429c542d/volumes"
Mar 20 18:45:11 crc kubenswrapper[4795]: I0320 18:45:11.252480 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:45:11 crc kubenswrapper[4795]: E0320 18:45:11.253532 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:45:13 crc kubenswrapper[4795]: I0320 18:45:13.457279 4795 scope.go:117] "RemoveContainer" containerID="1dda362040903edd45488eb5dfa4174252f2f44818cf7249d6ad4da4aa90fe4e"
Mar 20 18:45:25 crc kubenswrapper[4795]: I0320 18:45:25.252909 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:45:25 crc kubenswrapper[4795]: E0320 18:45:25.255912 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:45:36 crc kubenswrapper[4795]: I0320 18:45:36.943117 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-smws7"]
Mar 20 18:45:36 crc kubenswrapper[4795]: E0320 18:45:36.944200 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4478c18d-2e03-416d-b738-d06a34d5291e" containerName="collect-profiles"
Mar 20 18:45:36 crc kubenswrapper[4795]: I0320 18:45:36.944223 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4478c18d-2e03-416d-b738-d06a34d5291e" containerName="collect-profiles"
Mar 20 18:45:36 crc kubenswrapper[4795]: I0320 18:45:36.944518 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4478c18d-2e03-416d-b738-d06a34d5291e" containerName="collect-profiles"
Mar 20 18:45:36 crc kubenswrapper[4795]: I0320 18:45:36.946402 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:36 crc kubenswrapper[4795]: I0320 18:45:36.970770 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smws7"]
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.106454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.106825 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pst\" (UniqueName: \"kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.107088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.208792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.208903 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.208939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67pst\" (UniqueName: \"kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.209540 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.209562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.235188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67pst\" (UniqueName: \"kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.280345 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.781070 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smws7"]
Mar 20 18:45:37 crc kubenswrapper[4795]: W0320 18:45:37.784639 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3995cb6e_a1d8_4a7b_98ae_ae112622968f.slice/crio-1bea37a6f6c90e3513c9c67e7d784990bf4234b8ca3db0d23667ca70bb108f94 WatchSource:0}: Error finding container 1bea37a6f6c90e3513c9c67e7d784990bf4234b8ca3db0d23667ca70bb108f94: Status 404 returned error can't find the container with id 1bea37a6f6c90e3513c9c67e7d784990bf4234b8ca3db0d23667ca70bb108f94
Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.968978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerStarted","Data":"1bea37a6f6c90e3513c9c67e7d784990bf4234b8ca3db0d23667ca70bb108f94"}
Mar 20 18:45:38 crc kubenswrapper[4795]: I0320 18:45:38.986124 4795 generic.go:334] "Generic (PLEG): container finished" podID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerID="9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f" exitCode=0
Mar 20 18:45:38 crc kubenswrapper[4795]: I0320 18:45:38.986181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerDied","Data":"9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f"}
Mar 20 18:45:38 crc kubenswrapper[4795]: I0320 18:45:38.988210 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 18:45:39 crc kubenswrapper[4795]: I0320 18:45:39.252505 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:45:39 crc kubenswrapper[4795]: E0320 18:45:39.252908 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:45:39 crc kubenswrapper[4795]: I0320 18:45:39.922801 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"]
Mar 20 18:45:39 crc kubenswrapper[4795]: I0320 18:45:39.926307 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:39 crc kubenswrapper[4795]: I0320 18:45:39.944059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"]
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.004370 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerStarted","Data":"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd"}
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.070060 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.070362 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.070428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz9pn\" (UniqueName: \"kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.172099 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.172180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz9pn\" (UniqueName: \"kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.172502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.172516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.174043 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.191002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz9pn\" (UniqueName: \"kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.296490 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.743086 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"]
Mar 20 18:45:41 crc kubenswrapper[4795]: I0320 18:45:41.017168 4795 generic.go:334] "Generic (PLEG): container finished" podID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerID="75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd" exitCode=0
Mar 20 18:45:41 crc kubenswrapper[4795]: I0320 18:45:41.017234 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerDied","Data":"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd"}
Mar 20 18:45:41 crc kubenswrapper[4795]: I0320 18:45:41.021462 4795 generic.go:334] "Generic (PLEG): container finished" podID="f9212b67-95e9-4502-a467-15325eab4f0f" containerID="43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b" exitCode=0
Mar 20 18:45:41 crc kubenswrapper[4795]: I0320 18:45:41.021504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerDied","Data":"43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b"}
Mar 20 18:45:41 crc kubenswrapper[4795]: I0320 18:45:41.021535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerStarted","Data":"4af5126384c0f89ed19f023b9ec49a9c682a00720cb7c5fda5444fa7e7ef8d5c"}
Mar 20 18:45:42 crc kubenswrapper[4795]: I0320 18:45:42.030999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerStarted","Data":"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac"}
Mar 20 18:45:42 crc kubenswrapper[4795]: I0320 18:45:42.034402 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerStarted","Data":"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2"}
Mar 20 18:45:42 crc kubenswrapper[4795]: I0320 18:45:42.079272 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-smws7" podStartSLOduration=3.519353844 podStartE2EDuration="6.07925327s" podCreationTimestamp="2026-03-20 18:45:36 +0000 UTC" firstStartedPulling="2026-03-20 18:45:38.987978136 +0000 UTC m=+5282.446009677" lastFinishedPulling="2026-03-20 18:45:41.547877562 +0000 UTC m=+5285.005909103" observedRunningTime="2026-03-20 18:45:42.067508677 +0000 UTC m=+5285.525540268" watchObservedRunningTime="2026-03-20 18:45:42.07925327 +0000 UTC m=+5285.537285061"
Mar 20 18:45:43 crc kubenswrapper[4795]: I0320 18:45:43.044112 4795 generic.go:334] "Generic (PLEG): container finished" podID="f9212b67-95e9-4502-a467-15325eab4f0f" containerID="0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac" exitCode=0
Mar 20 18:45:43 crc kubenswrapper[4795]: I0320 18:45:43.044960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerDied","Data":"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac"}
Mar 20 18:45:44 crc kubenswrapper[4795]: I0320 18:45:44.054372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerStarted","Data":"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557"}
Mar 20 18:45:44 crc kubenswrapper[4795]: I0320 18:45:44.075074 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2g56c" podStartSLOduration=2.582040582 podStartE2EDuration="5.075047105s" podCreationTimestamp="2026-03-20 18:45:39 +0000 UTC" firstStartedPulling="2026-03-20 18:45:41.022939123 +0000 UTC m=+5284.480970674" lastFinishedPulling="2026-03-20 18:45:43.515945646 +0000 UTC m=+5286.973977197" observedRunningTime="2026-03-20 18:45:44.069308457 +0000 UTC m=+5287.527340008" watchObservedRunningTime="2026-03-20 18:45:44.075047105 +0000 UTC m=+5287.533078736"
Mar 20 18:45:47 crc kubenswrapper[4795]: I0320 18:45:47.280930 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:47 crc kubenswrapper[4795]: I0320 18:45:47.281407 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:47 crc kubenswrapper[4795]: I0320 18:45:47.711035 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:48 crc kubenswrapper[4795]: I0320 18:45:48.145117 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:49 crc kubenswrapper[4795]: I0320 18:45:49.112475 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smws7"]
Mar 20 18:45:50 crc kubenswrapper[4795]: I0320 18:45:50.121341 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-smws7" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="registry-server" containerID="cri-o://485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2" gracePeriod=2
Mar 20 18:45:50 crc kubenswrapper[4795]: I0320 18:45:50.297818 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:50 crc kubenswrapper[4795]: I0320 18:45:50.297885 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.062506 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.133121 4795 generic.go:334] "Generic (PLEG): container finished" podID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerID="485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2" exitCode=0
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.133166 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerDied","Data":"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2"}
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.133181 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smws7"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.133191 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerDied","Data":"1bea37a6f6c90e3513c9c67e7d784990bf4234b8ca3db0d23667ca70bb108f94"}
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.133209 4795 scope.go:117] "RemoveContainer" containerID="485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.150341 4795 scope.go:117] "RemoveContainer" containerID="75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.167826 4795 scope.go:117] "RemoveContainer" containerID="9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.192659 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities\") pod \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") "
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.192817 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content\") pod \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") "
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.192951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67pst\" (UniqueName: \"kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst\") pod \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") "
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.193617 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities" (OuterVolumeSpecName: "utilities") pod "3995cb6e-a1d8-4a7b-98ae-ae112622968f" (UID: "3995cb6e-a1d8-4a7b-98ae-ae112622968f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.193985 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.200077 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst" (OuterVolumeSpecName: "kube-api-access-67pst") pod "3995cb6e-a1d8-4a7b-98ae-ae112622968f" (UID: "3995cb6e-a1d8-4a7b-98ae-ae112622968f"). InnerVolumeSpecName "kube-api-access-67pst". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.206150 4795 scope.go:117] "RemoveContainer" containerID="485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2"
Mar 20 18:45:51 crc kubenswrapper[4795]: E0320 18:45:51.207005 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2\": container with ID starting with 485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2 not found: ID does not exist" containerID="485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.207062 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2"} err="failed to get container status \"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2\": rpc error: code = NotFound desc = could not find container \"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2\": container with ID starting with 485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2 not found: ID does not exist"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.207097 4795 scope.go:117] "RemoveContainer" containerID="75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd"
Mar 20 18:45:51 crc kubenswrapper[4795]: E0320 18:45:51.207478 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd\": container with ID starting with 75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd not found: ID does not exist" containerID="75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.207547 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd"} err="failed to get container status \"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd\": rpc error: code = NotFound desc = could not find container \"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd\": container with ID starting with 75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd not found: ID does not exist"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.207585 4795 scope.go:117] "RemoveContainer" containerID="9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f"
Mar 20 18:45:51 crc kubenswrapper[4795]: E0320 18:45:51.207999 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f\": container with ID starting with 9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f not found: ID does not exist" containerID="9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.208038 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f"} err="failed to get container status \"9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f\": rpc error: code = NotFound desc = could not find container \"9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f\": container with ID starting with 9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f not found: ID does not exist"
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.241657 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3995cb6e-a1d8-4a7b-98ae-ae112622968f" (UID: "3995cb6e-a1d8-4a7b-98ae-ae112622968f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.296532 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.296581 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67pst\" (UniqueName: \"kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst\") on node \"crc\" DevicePath \"\""
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.350726 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2g56c" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="registry-server" probeResult="failure" output=<
Mar 20 18:45:51 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Mar 20 18:45:51 crc kubenswrapper[4795]: >
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.467643 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smws7"]
Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.478141 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-smws7"]
Mar 20 18:45:53 crc kubenswrapper[4795]: I0320 18:45:53.252295 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:45:53 crc kubenswrapper[4795]: E0320 18:45:53.253114 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:45:53 crc kubenswrapper[4795]: I0320 18:45:53.265666 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" path="/var/lib/kubelet/pods/3995cb6e-a1d8-4a7b-98ae-ae112622968f/volumes"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.166000 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567206-h4p5r"]
Mar 20 18:46:00 crc kubenswrapper[4795]: E0320 18:46:00.166886 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="registry-server"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.166901 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="registry-server"
Mar 20 18:46:00 crc kubenswrapper[4795]: E0320 18:46:00.166919 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="extract-content"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.166927 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="extract-content"
Mar 20 18:46:00 crc kubenswrapper[4795]: E0320 18:46:00.166967 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="extract-utilities"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.166975 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="extract-utilities"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.167172 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="registry-server"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.167881 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-h4p5r"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.172196 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.172654 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.172661 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.195003 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567206-h4p5r"]
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.347677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgd9r\" (UniqueName: \"kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r\") pod \"auto-csr-approver-29567206-h4p5r\" (UID: \"9ce9f713-91c6-4873-90e4-740174e9e0d5\") " pod="openshift-infra/auto-csr-approver-29567206-h4p5r"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.348948 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.402730 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.463182 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgd9r\" (UniqueName: \"kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r\") pod \"auto-csr-approver-29567206-h4p5r\" (UID: \"9ce9f713-91c6-4873-90e4-740174e9e0d5\") " pod="openshift-infra/auto-csr-approver-29567206-h4p5r"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.494581 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgd9r\" (UniqueName: \"kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r\") pod \"auto-csr-approver-29567206-h4p5r\" (UID: \"9ce9f713-91c6-4873-90e4-740174e9e0d5\") " pod="openshift-infra/auto-csr-approver-29567206-h4p5r"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.503495 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-h4p5r"
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.623507 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"]
Mar 20 18:46:00 crc kubenswrapper[4795]: W0320 18:46:00.989984 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ce9f713_91c6_4873_90e4_740174e9e0d5.slice/crio-caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2 WatchSource:0}: Error finding container caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2: Status 404 returned error can't find the container with id caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2
Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.990762 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567206-h4p5r"]
Mar 20 18:46:01 crc kubenswrapper[4795]: I0320 18:46:01.271780 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567206-h4p5r" event={"ID":"9ce9f713-91c6-4873-90e4-740174e9e0d5","Type":"ContainerStarted","Data":"caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2"}
Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.267707 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2g56c" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="registry-server" containerID="cri-o://f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557" gracePeriod=2
Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.772540 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.905669 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content\") pod \"f9212b67-95e9-4502-a467-15325eab4f0f\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") "
Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.905845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities\") pod \"f9212b67-95e9-4502-a467-15325eab4f0f\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") "
Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.905909 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz9pn\" (UniqueName: \"kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn\") pod \"f9212b67-95e9-4502-a467-15325eab4f0f\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") "
Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.906671 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities"
(OuterVolumeSpecName: "utilities") pod "f9212b67-95e9-4502-a467-15325eab4f0f" (UID: "f9212b67-95e9-4502-a467-15325eab4f0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.914476 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn" (OuterVolumeSpecName: "kube-api-access-tz9pn") pod "f9212b67-95e9-4502-a467-15325eab4f0f" (UID: "f9212b67-95e9-4502-a467-15325eab4f0f"). InnerVolumeSpecName "kube-api-access-tz9pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.008591 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.008639 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz9pn\" (UniqueName: \"kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn\") on node \"crc\" DevicePath \"\"" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.076293 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9212b67-95e9-4502-a467-15325eab4f0f" (UID: "f9212b67-95e9-4502-a467-15325eab4f0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.125168 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.284353 4795 generic.go:334] "Generic (PLEG): container finished" podID="f9212b67-95e9-4502-a467-15325eab4f0f" containerID="f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557" exitCode=0 Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.284424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerDied","Data":"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557"} Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.284442 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.284456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerDied","Data":"4af5126384c0f89ed19f023b9ec49a9c682a00720cb7c5fda5444fa7e7ef8d5c"} Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.284479 4795 scope.go:117] "RemoveContainer" containerID="f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.287425 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ce9f713-91c6-4873-90e4-740174e9e0d5" containerID="dd79648dd7c7f0b83d35341008cda3ef6a0f783a3f1f580f77e1002265d29e48" exitCode=0 Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.287516 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567206-h4p5r" event={"ID":"9ce9f713-91c6-4873-90e4-740174e9e0d5","Type":"ContainerDied","Data":"dd79648dd7c7f0b83d35341008cda3ef6a0f783a3f1f580f77e1002265d29e48"} Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.319529 4795 scope.go:117] "RemoveContainer" containerID="0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.349803 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"] Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.350301 4795 scope.go:117] "RemoveContainer" containerID="43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.358292 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"] Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.406055 4795 scope.go:117] "RemoveContainer" 
containerID="f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557" Mar 20 18:46:03 crc kubenswrapper[4795]: E0320 18:46:03.406481 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557\": container with ID starting with f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557 not found: ID does not exist" containerID="f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.406518 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557"} err="failed to get container status \"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557\": rpc error: code = NotFound desc = could not find container \"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557\": container with ID starting with f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557 not found: ID does not exist" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.406551 4795 scope.go:117] "RemoveContainer" containerID="0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac" Mar 20 18:46:03 crc kubenswrapper[4795]: E0320 18:46:03.406838 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac\": container with ID starting with 0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac not found: ID does not exist" containerID="0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.406953 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac"} err="failed to get container status \"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac\": rpc error: code = NotFound desc = could not find container \"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac\": container with ID starting with 0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac not found: ID does not exist" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.407073 4795 scope.go:117] "RemoveContainer" containerID="43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b" Mar 20 18:46:03 crc kubenswrapper[4795]: E0320 18:46:03.407438 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b\": container with ID starting with 43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b not found: ID does not exist" containerID="43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.407467 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b"} err="failed to get container status \"43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b\": rpc error: code = NotFound desc = could not find container \"43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b\": container with ID starting with 43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b not found: ID does not exist" Mar 20 18:46:04 crc kubenswrapper[4795]: I0320 18:46:04.685739 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-h4p5r" Mar 20 18:46:04 crc kubenswrapper[4795]: I0320 18:46:04.765217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgd9r\" (UniqueName: \"kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r\") pod \"9ce9f713-91c6-4873-90e4-740174e9e0d5\" (UID: \"9ce9f713-91c6-4873-90e4-740174e9e0d5\") " Mar 20 18:46:04 crc kubenswrapper[4795]: I0320 18:46:04.771043 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r" (OuterVolumeSpecName: "kube-api-access-tgd9r") pod "9ce9f713-91c6-4873-90e4-740174e9e0d5" (UID: "9ce9f713-91c6-4873-90e4-740174e9e0d5"). InnerVolumeSpecName "kube-api-access-tgd9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:46:04 crc kubenswrapper[4795]: I0320 18:46:04.867174 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgd9r\" (UniqueName: \"kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r\") on node \"crc\" DevicePath \"\"" Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.261722 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" path="/var/lib/kubelet/pods/f9212b67-95e9-4502-a467-15325eab4f0f/volumes" Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.310710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567206-h4p5r" event={"ID":"9ce9f713-91c6-4873-90e4-740174e9e0d5","Type":"ContainerDied","Data":"caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2"} Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.310752 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2" Mar 20 18:46:05 
crc kubenswrapper[4795]: I0320 18:46:05.310764 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-h4p5r" Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.763344 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-sb6x8"] Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.773122 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-sb6x8"] Mar 20 18:46:06 crc kubenswrapper[4795]: I0320 18:46:06.252234 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:46:06 crc kubenswrapper[4795]: E0320 18:46:06.252636 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:46:07 crc kubenswrapper[4795]: I0320 18:46:07.277720 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b0a7d2-ac09-4b84-8083-48c33d97b032" path="/var/lib/kubelet/pods/37b0a7d2-ac09-4b84-8083-48c33d97b032/volumes" Mar 20 18:46:13 crc kubenswrapper[4795]: I0320 18:46:13.602502 4795 scope.go:117] "RemoveContainer" containerID="f6bd6f59f3702ed1b553664f1b56ecda875d8c79fa2d467d34f79e36c2a97634" Mar 20 18:46:18 crc kubenswrapper[4795]: I0320 18:46:18.252703 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:46:18 crc kubenswrapper[4795]: E0320 18:46:18.253421 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:46:31 crc kubenswrapper[4795]: I0320 18:46:31.252807 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:46:31 crc kubenswrapper[4795]: E0320 18:46:31.253634 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:46:43 crc kubenswrapper[4795]: I0320 18:46:43.254292 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:46:43 crc kubenswrapper[4795]: E0320 18:46:43.255511 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:46:54 crc kubenswrapper[4795]: I0320 18:46:54.252875 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:46:54 crc kubenswrapper[4795]: E0320 18:46:54.253503 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:47:07 crc kubenswrapper[4795]: I0320 18:47:07.265100 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:47:07 crc kubenswrapper[4795]: E0320 18:47:07.265751 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:47:20 crc kubenswrapper[4795]: I0320 18:47:20.251998 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:47:20 crc kubenswrapper[4795]: E0320 18:47:20.253249 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:47:34 crc kubenswrapper[4795]: I0320 18:47:34.253500 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:47:34 crc kubenswrapper[4795]: E0320 18:47:34.254504 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:47:46 crc kubenswrapper[4795]: I0320 18:47:46.251846 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:47:46 crc kubenswrapper[4795]: E0320 18:47:46.252606 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.138953 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567208-2xwg5"] Mar 20 18:48:00 crc kubenswrapper[4795]: E0320 18:48:00.140061 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce9f713-91c6-4873-90e4-740174e9e0d5" containerName="oc" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140080 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce9f713-91c6-4873-90e4-740174e9e0d5" containerName="oc" Mar 20 18:48:00 crc kubenswrapper[4795]: E0320 18:48:00.140099 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="registry-server" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140106 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="registry-server" Mar 20 18:48:00 crc 
kubenswrapper[4795]: E0320 18:48:00.140138 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="extract-utilities" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140147 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="extract-utilities" Mar 20 18:48:00 crc kubenswrapper[4795]: E0320 18:48:00.140167 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="extract-content" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140172 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="extract-content" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140338 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="registry-server" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140353 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce9f713-91c6-4873-90e4-740174e9e0d5" containerName="oc" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.141158 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567208-2xwg5" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.143564 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.144253 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.146566 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.147619 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567208-2xwg5"] Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.218609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knnc7\" (UniqueName: \"kubernetes.io/projected/99657a8e-1528-44bf-a0b2-3098aa05b8bb-kube-api-access-knnc7\") pod \"auto-csr-approver-29567208-2xwg5\" (UID: \"99657a8e-1528-44bf-a0b2-3098aa05b8bb\") " pod="openshift-infra/auto-csr-approver-29567208-2xwg5" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.321184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knnc7\" (UniqueName: \"kubernetes.io/projected/99657a8e-1528-44bf-a0b2-3098aa05b8bb-kube-api-access-knnc7\") pod \"auto-csr-approver-29567208-2xwg5\" (UID: \"99657a8e-1528-44bf-a0b2-3098aa05b8bb\") " pod="openshift-infra/auto-csr-approver-29567208-2xwg5" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.341331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knnc7\" (UniqueName: \"kubernetes.io/projected/99657a8e-1528-44bf-a0b2-3098aa05b8bb-kube-api-access-knnc7\") pod \"auto-csr-approver-29567208-2xwg5\" (UID: \"99657a8e-1528-44bf-a0b2-3098aa05b8bb\") " 
pod="openshift-infra/auto-csr-approver-29567208-2xwg5" Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.457184 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567208-2xwg5"